Jan 22 21:12:44 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 22 21:12:44 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 22 21:12:44 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 21:12:44 localhost kernel: BIOS-provided physical RAM map:
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 22 21:12:44 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 22 21:12:44 localhost kernel: NX (Execute Disable) protection: active
Jan 22 21:12:44 localhost kernel: APIC: Static calls initialized
Jan 22 21:12:44 localhost kernel: SMBIOS 2.8 present.
Jan 22 21:12:44 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 22 21:12:44 localhost kernel: Hypervisor detected: KVM
Jan 22 21:12:44 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 22 21:12:44 localhost kernel: kvm-clock: using sched offset of 3216458302 cycles
Jan 22 21:12:44 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 22 21:12:44 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 22 21:12:44 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 22 21:12:44 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 22 21:12:44 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 22 21:12:44 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 22 21:12:44 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 22 21:12:44 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 22 21:12:44 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 22 21:12:44 localhost kernel: Using GB pages for direct mapping
Jan 22 21:12:44 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 22 21:12:44 localhost kernel: ACPI: Early table checksum verification disabled
Jan 22 21:12:44 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 22 21:12:44 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 21:12:44 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 21:12:44 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 21:12:44 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 22 21:12:44 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 21:12:44 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 21:12:44 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 22 21:12:44 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 22 21:12:44 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 22 21:12:44 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 22 21:12:44 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 22 21:12:44 localhost kernel: No NUMA configuration found
Jan 22 21:12:44 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 22 21:12:44 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 22 21:12:44 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 22 21:12:44 localhost kernel: Zone ranges:
Jan 22 21:12:44 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 22 21:12:44 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 22 21:12:44 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 21:12:44 localhost kernel:   Device   empty
Jan 22 21:12:44 localhost kernel: Movable zone start for each node
Jan 22 21:12:44 localhost kernel: Early memory node ranges
Jan 22 21:12:44 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 22 21:12:44 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 22 21:12:44 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 21:12:44 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 22 21:12:44 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 22 21:12:44 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 22 21:12:44 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 22 21:12:44 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 22 21:12:44 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 22 21:12:44 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 22 21:12:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 22 21:12:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 22 21:12:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 22 21:12:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 22 21:12:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 22 21:12:44 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 22 21:12:44 localhost kernel: TSC deadline timer available
Jan 22 21:12:44 localhost kernel: CPU topo: Max. logical packages:   8
Jan 22 21:12:44 localhost kernel: CPU topo: Max. logical dies:       8
Jan 22 21:12:44 localhost kernel: CPU topo: Max. dies per package:   1
Jan 22 21:12:44 localhost kernel: CPU topo: Max. threads per core:   1
Jan 22 21:12:44 localhost kernel: CPU topo: Num. cores per package:     1
Jan 22 21:12:44 localhost kernel: CPU topo: Num. threads per package:   1
Jan 22 21:12:44 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 22 21:12:44 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 22 21:12:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 22 21:12:44 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 22 21:12:44 localhost kernel: Booting paravirtualized kernel on KVM
Jan 22 21:12:44 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 22 21:12:44 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 22 21:12:44 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 22 21:12:44 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 22 21:12:44 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 22 21:12:44 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 22 21:12:44 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 21:12:44 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 22 21:12:44 localhost kernel: random: crng init done
Jan 22 21:12:44 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 22 21:12:44 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 22 21:12:44 localhost kernel: Fallback order for Node 0: 0 
Jan 22 21:12:44 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 22 21:12:44 localhost kernel: Policy zone: Normal
Jan 22 21:12:44 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 22 21:12:44 localhost kernel: software IO TLB: area num 8.
Jan 22 21:12:44 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 22 21:12:44 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 22 21:12:44 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 22 21:12:44 localhost kernel: Dynamic Preempt: voluntary
Jan 22 21:12:44 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 22 21:12:44 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 22 21:12:44 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 22 21:12:44 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 22 21:12:44 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 22 21:12:44 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 22 21:12:44 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 22 21:12:44 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 22 21:12:44 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 21:12:44 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 21:12:44 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 21:12:44 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 22 21:12:44 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 22 21:12:44 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 22 21:12:44 localhost kernel: Console: colour VGA+ 80x25
Jan 22 21:12:44 localhost kernel: printk: console [ttyS0] enabled
Jan 22 21:12:44 localhost kernel: ACPI: Core revision 20230331
Jan 22 21:12:44 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 22 21:12:44 localhost kernel: x2apic enabled
Jan 22 21:12:44 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 22 21:12:44 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 22 21:12:44 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 22 21:12:44 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 22 21:12:44 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 22 21:12:44 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 22 21:12:44 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 22 21:12:44 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 22 21:12:44 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 22 21:12:44 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 22 21:12:44 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 22 21:12:44 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 22 21:12:44 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 22 21:12:44 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 22 21:12:44 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 22 21:12:44 localhost kernel: x86/bugs: return thunk changed
Jan 22 21:12:44 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 22 21:12:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 22 21:12:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 22 21:12:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 22 21:12:44 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 22 21:12:44 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 22 21:12:44 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 22 21:12:44 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 22 21:12:44 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 22 21:12:44 localhost kernel: landlock: Up and running.
Jan 22 21:12:44 localhost kernel: Yama: becoming mindful.
Jan 22 21:12:44 localhost kernel: SELinux:  Initializing.
Jan 22 21:12:44 localhost kernel: LSM support for eBPF active
Jan 22 21:12:44 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 21:12:44 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 21:12:44 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 22 21:12:44 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 22 21:12:44 localhost kernel: ... version:                0
Jan 22 21:12:44 localhost kernel: ... bit width:              48
Jan 22 21:12:44 localhost kernel: ... generic registers:      6
Jan 22 21:12:44 localhost kernel: ... value mask:             0000ffffffffffff
Jan 22 21:12:44 localhost kernel: ... max period:             00007fffffffffff
Jan 22 21:12:44 localhost kernel: ... fixed-purpose events:   0
Jan 22 21:12:44 localhost kernel: ... event mask:             000000000000003f
Jan 22 21:12:44 localhost kernel: signal: max sigframe size: 1776
Jan 22 21:12:44 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 22 21:12:44 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 22 21:12:44 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 22 21:12:44 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 22 21:12:44 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 22 21:12:44 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 22 21:12:44 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 22 21:12:44 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 22 21:12:44 localhost kernel: Memory: 7763820K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 22 21:12:44 localhost kernel: devtmpfs: initialized
Jan 22 21:12:44 localhost kernel: x86/mm: Memory block size: 128MB
Jan 22 21:12:44 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 22 21:12:44 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 22 21:12:44 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 22 21:12:44 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 22 21:12:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 22 21:12:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 22 21:12:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 22 21:12:44 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 22 21:12:44 localhost kernel: audit: type=2000 audit(1769116362.771:1): state=initialized audit_enabled=0 res=1
Jan 22 21:12:44 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 22 21:12:44 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 22 21:12:44 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 22 21:12:44 localhost kernel: cpuidle: using governor menu
Jan 22 21:12:44 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 22 21:12:44 localhost kernel: PCI: Using configuration type 1 for base access
Jan 22 21:12:44 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 22 21:12:44 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 22 21:12:44 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 22 21:12:44 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 22 21:12:44 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 22 21:12:44 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 22 21:12:44 localhost kernel: Demotion targets for Node 0: null
Jan 22 21:12:44 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 22 21:12:44 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 22 21:12:44 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 22 21:12:44 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 22 21:12:44 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 22 21:12:44 localhost kernel: ACPI: Interpreter enabled
Jan 22 21:12:44 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 22 21:12:44 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 22 21:12:44 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 22 21:12:44 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 22 21:12:44 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 22 21:12:44 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 22 21:12:44 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [3] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [4] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [5] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [6] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [7] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [8] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [9] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [10] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [11] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [12] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [13] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [14] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [15] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [16] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [17] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [18] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [19] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [20] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [21] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [22] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [23] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [24] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [25] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [26] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [27] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [28] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [29] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [30] registered
Jan 22 21:12:44 localhost kernel: acpiphp: Slot [31] registered
Jan 22 21:12:44 localhost kernel: PCI host bridge to bus 0000:00
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 22 21:12:44 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 22 21:12:44 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 22 21:12:44 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 21:12:44 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 22 21:12:44 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 22 21:12:44 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 22 21:12:44 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 22 21:12:44 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 22 21:12:44 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 22 21:12:44 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 22 21:12:44 localhost kernel: iommu: Default domain type: Translated
Jan 22 21:12:44 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 22 21:12:44 localhost kernel: SCSI subsystem initialized
Jan 22 21:12:44 localhost kernel: ACPI: bus type USB registered
Jan 22 21:12:44 localhost kernel: usbcore: registered new interface driver usbfs
Jan 22 21:12:44 localhost kernel: usbcore: registered new interface driver hub
Jan 22 21:12:44 localhost kernel: usbcore: registered new device driver usb
Jan 22 21:12:44 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 22 21:12:44 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 22 21:12:44 localhost kernel: PTP clock support registered
Jan 22 21:12:44 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 22 21:12:44 localhost kernel: NetLabel: Initializing
Jan 22 21:12:44 localhost kernel: NetLabel:  domain hash size = 128
Jan 22 21:12:44 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 22 21:12:44 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 22 21:12:44 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 22 21:12:44 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 22 21:12:44 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 22 21:12:44 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 22 21:12:44 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 22 21:12:44 localhost kernel: vgaarb: loaded
Jan 22 21:12:44 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 22 21:12:44 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 22 21:12:44 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 22 21:12:44 localhost kernel: pnp: PnP ACPI init
Jan 22 21:12:44 localhost kernel: pnp 00:03: [dma 2]
Jan 22 21:12:44 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 22 21:12:44 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 22 21:12:44 localhost kernel: NET: Registered PF_INET protocol family
Jan 22 21:12:44 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 22 21:12:44 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 22 21:12:44 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 22 21:12:44 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 22 21:12:44 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 22 21:12:44 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 22 21:12:44 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 22 21:12:44 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 21:12:44 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 21:12:44 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 22 21:12:44 localhost kernel: NET: Registered PF_XDP protocol family
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 22 21:12:44 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 22 21:12:44 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 22 21:12:44 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 22 21:12:44 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71562 usecs
Jan 22 21:12:44 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 22 21:12:44 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 22 21:12:44 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 22 21:12:44 localhost kernel: ACPI: bus type thunderbolt registered
Jan 22 21:12:44 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 22 21:12:44 localhost kernel: Initialise system trusted keyrings
Jan 22 21:12:44 localhost kernel: Key type blacklist registered
Jan 22 21:12:44 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 22 21:12:44 localhost kernel: zbud: loaded
Jan 22 21:12:44 localhost kernel: integrity: Platform Keyring initialized
Jan 22 21:12:44 localhost kernel: integrity: Machine keyring initialized
Jan 22 21:12:44 localhost kernel: Freeing initrd memory: 87956K
Jan 22 21:12:44 localhost kernel: NET: Registered PF_ALG protocol family
Jan 22 21:12:44 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 22 21:12:44 localhost kernel: Key type asymmetric registered
Jan 22 21:12:44 localhost kernel: Asymmetric key parser 'x509' registered
Jan 22 21:12:44 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 22 21:12:44 localhost kernel: io scheduler mq-deadline registered
Jan 22 21:12:44 localhost kernel: io scheduler kyber registered
Jan 22 21:12:44 localhost kernel: io scheduler bfq registered
Jan 22 21:12:44 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 22 21:12:44 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 22 21:12:44 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 22 21:12:44 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 22 21:12:44 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 22 21:12:44 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 22 21:12:44 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 22 21:12:44 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 22 21:12:44 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 22 21:12:44 localhost kernel: Non-volatile memory driver v1.3
Jan 22 21:12:44 localhost kernel: rdac: device handler registered
Jan 22 21:12:44 localhost kernel: hp_sw: device handler registered
Jan 22 21:12:44 localhost kernel: emc: device handler registered
Jan 22 21:12:44 localhost kernel: alua: device handler registered
Jan 22 21:12:44 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 22 21:12:44 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 22 21:12:44 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 22 21:12:44 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 22 21:12:44 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 22 21:12:44 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 22 21:12:44 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 22 21:12:44 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 22 21:12:44 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 22 21:12:44 localhost kernel: hub 1-0:1.0: USB hub found
Jan 22 21:12:44 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 22 21:12:44 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 22 21:12:44 localhost kernel: usbserial: USB Serial support registered for generic
Jan 22 21:12:44 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 22 21:12:44 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 22 21:12:44 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 22 21:12:44 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 22 21:12:44 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 22 21:12:44 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 22 21:12:44 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 22 21:12:44 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 22 21:12:44 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-22T21:12:43 UTC (1769116363)
Jan 22 21:12:44 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 22 21:12:44 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 22 21:12:44 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 22 21:12:44 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 22 21:12:44 localhost kernel: usbcore: registered new interface driver usbhid
Jan 22 21:12:44 localhost kernel: usbhid: USB HID core driver
Jan 22 21:12:44 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 22 21:12:44 localhost kernel: Initializing XFRM netlink socket
Jan 22 21:12:44 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 22 21:12:44 localhost kernel: Segment Routing with IPv6
Jan 22 21:12:44 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 22 21:12:44 localhost kernel: mpls_gso: MPLS GSO support
Jan 22 21:12:44 localhost kernel: IPI shorthand broadcast: enabled
Jan 22 21:12:44 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 22 21:12:44 localhost kernel: AES CTR mode by8 optimization enabled
Jan 22 21:12:44 localhost kernel: sched_clock: Marking stable (1231006049, 142295680)->(1483688719, -110386990)
Jan 22 21:12:44 localhost kernel: registered taskstats version 1
Jan 22 21:12:44 localhost kernel: Loading compiled-in X.509 certificates
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 22 21:12:44 localhost kernel: Demotion targets for Node 0: null
Jan 22 21:12:44 localhost kernel: page_owner is disabled
Jan 22 21:12:44 localhost kernel: Key type .fscrypt registered
Jan 22 21:12:44 localhost kernel: Key type fscrypt-provisioning registered
Jan 22 21:12:44 localhost kernel: Key type big_key registered
Jan 22 21:12:44 localhost kernel: Key type encrypted registered
Jan 22 21:12:44 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 22 21:12:44 localhost kernel: Loading compiled-in module X.509 certificates
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 21:12:44 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 22 21:12:44 localhost kernel: ima: No architecture policies found
Jan 22 21:12:44 localhost kernel: evm: Initialising EVM extended attributes:
Jan 22 21:12:44 localhost kernel: evm: security.selinux
Jan 22 21:12:44 localhost kernel: evm: security.SMACK64 (disabled)
Jan 22 21:12:44 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 22 21:12:44 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 22 21:12:44 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 22 21:12:44 localhost kernel: evm: security.apparmor (disabled)
Jan 22 21:12:44 localhost kernel: evm: security.ima
Jan 22 21:12:44 localhost kernel: evm: security.capability
Jan 22 21:12:44 localhost kernel: evm: HMAC attrs: 0x1
Jan 22 21:12:44 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 22 21:12:44 localhost kernel: Running certificate verification RSA selftest
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 22 21:12:44 localhost kernel: Running certificate verification ECDSA selftest
Jan 22 21:12:44 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 22 21:12:44 localhost kernel: clk: Disabling unused clocks
Jan 22 21:12:44 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 22 21:12:44 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 22 21:12:44 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 22 21:12:44 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 22 21:12:44 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 22 21:12:44 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 22 21:12:44 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 22 21:12:44 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 22 21:12:44 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 22 21:12:44 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 22 21:12:44 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 22 21:12:44 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 22 21:12:44 localhost kernel: Run /init as init process
Jan 22 21:12:44 localhost kernel:   with arguments:
Jan 22 21:12:44 localhost kernel:     /init
Jan 22 21:12:44 localhost kernel:   with environment:
Jan 22 21:12:44 localhost kernel:     HOME=/
Jan 22 21:12:44 localhost kernel:     TERM=linux
Jan 22 21:12:44 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 22 21:12:44 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 21:12:44 localhost systemd[1]: Detected virtualization kvm.
Jan 22 21:12:44 localhost systemd[1]: Detected architecture x86-64.
Jan 22 21:12:44 localhost systemd[1]: Running in initrd.
Jan 22 21:12:44 localhost systemd[1]: No hostname configured, using default hostname.
Jan 22 21:12:44 localhost systemd[1]: Hostname set to <localhost>.
Jan 22 21:12:44 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 22 21:12:44 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 22 21:12:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 21:12:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 22 21:12:44 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 22 21:12:44 localhost systemd[1]: Reached target Local File Systems.
Jan 22 21:12:44 localhost systemd[1]: Reached target Path Units.
Jan 22 21:12:44 localhost systemd[1]: Reached target Slice Units.
Jan 22 21:12:44 localhost systemd[1]: Reached target Swaps.
Jan 22 21:12:44 localhost systemd[1]: Reached target Timer Units.
Jan 22 21:12:44 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 21:12:44 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 22 21:12:44 localhost systemd[1]: Listening on Journal Socket.
Jan 22 21:12:44 localhost systemd[1]: Listening on udev Control Socket.
Jan 22 21:12:44 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 22 21:12:44 localhost systemd[1]: Reached target Socket Units.
Jan 22 21:12:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 22 21:12:44 localhost systemd[1]: Starting Journal Service...
Jan 22 21:12:44 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 21:12:44 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 22 21:12:44 localhost systemd[1]: Starting Create System Users...
Jan 22 21:12:44 localhost systemd[1]: Starting Setup Virtual Console...
Jan 22 21:12:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 22 21:12:44 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 22 21:12:44 localhost systemd[1]: Finished Create System Users.
Jan 22 21:12:44 localhost systemd-journald[308]: Journal started
Jan 22 21:12:44 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/148e2083b3dc4db0b189a79547a2be98) is 8.0M, max 153.6M, 145.6M free.
Jan 22 21:12:44 localhost systemd-sysusers[313]: Creating group 'users' with GID 100.
Jan 22 21:12:44 localhost systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Jan 22 21:12:44 localhost systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 22 21:12:44 localhost systemd[1]: Started Journal Service.
Jan 22 21:12:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 21:12:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 21:12:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 21:12:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 21:12:44 localhost systemd[1]: Finished Setup Virtual Console.
Jan 22 21:12:44 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 22 21:12:44 localhost systemd[1]: Starting dracut cmdline hook...
Jan 22 21:12:44 localhost dracut-cmdline[329]: dracut-9 dracut-057-102.git20250818.el9
Jan 22 21:12:44 localhost dracut-cmdline[329]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 21:12:44 localhost systemd[1]: Finished dracut cmdline hook.
Jan 22 21:12:44 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 22 21:12:44 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 22 21:12:44 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 22 21:12:44 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 22 21:12:44 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 22 21:12:44 localhost kernel: RPC: Registered udp transport module.
Jan 22 21:12:44 localhost kernel: RPC: Registered tcp transport module.
Jan 22 21:12:44 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 22 21:12:44 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 22 21:12:44 localhost rpc.statd[445]: Version 2.5.4 starting
Jan 22 21:12:44 localhost rpc.statd[445]: Initializing NSM state
Jan 22 21:12:44 localhost rpc.idmapd[450]: Setting log level to 0
Jan 22 21:12:44 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 22 21:12:44 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 21:12:44 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 21:12:44 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 21:12:44 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 22 21:12:44 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 22 21:12:45 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 22 21:12:45 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 22 21:12:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 21:12:45 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 22 21:12:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 21:12:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 21:12:45 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 21:12:45 localhost systemd[1]: Reached target Network.
Jan 22 21:12:45 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 21:12:45 localhost systemd[1]: Starting dracut initqueue hook...
Jan 22 21:12:45 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 22 21:12:45 localhost systemd-udevd[495]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:12:45 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 22 21:12:45 localhost kernel:  vda: vda1
Jan 22 21:12:45 localhost kernel: libata version 3.00 loaded.
Jan 22 21:12:45 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 21:12:45 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 22 21:12:45 localhost kernel: scsi host0: ata_piix
Jan 22 21:12:45 localhost kernel: scsi host1: ata_piix
Jan 22 21:12:45 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 22 21:12:45 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 22 21:12:45 localhost systemd[1]: Reached target Initrd Root Device.
Jan 22 21:12:45 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 22 21:12:45 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 22 21:12:45 localhost systemd[1]: Reached target System Initialization.
Jan 22 21:12:45 localhost systemd[1]: Reached target Basic System.
Jan 22 21:12:45 localhost kernel: ata1: found unknown device (class 0)
Jan 22 21:12:45 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 22 21:12:45 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 22 21:12:45 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 22 21:12:45 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 22 21:12:45 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 22 21:12:45 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 22 21:12:45 localhost systemd[1]: Finished dracut initqueue hook.
Jan 22 21:12:45 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 21:12:45 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 22 21:12:45 localhost systemd[1]: Reached target Remote File Systems.
Jan 22 21:12:45 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 22 21:12:45 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 22 21:12:45 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 22 21:12:45 localhost systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Jan 22 21:12:45 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 21:12:45 localhost systemd[1]: Mounting /sysroot...
Jan 22 21:12:46 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 22 21:12:46 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 22 21:12:46 localhost kernel: XFS (vda1): Ending clean mount
Jan 22 21:12:46 localhost systemd[1]: Mounted /sysroot.
Jan 22 21:12:46 localhost systemd[1]: Reached target Initrd Root File System.
Jan 22 21:12:46 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 22 21:12:46 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 22 21:12:46 localhost systemd[1]: Reached target Initrd File Systems.
Jan 22 21:12:46 localhost systemd[1]: Reached target Initrd Default Target.
Jan 22 21:12:46 localhost systemd[1]: Starting dracut mount hook...
Jan 22 21:12:46 localhost systemd[1]: Finished dracut mount hook.
Jan 22 21:12:46 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 22 21:12:46 localhost rpc.idmapd[450]: exiting on signal 15
Jan 22 21:12:46 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 22 21:12:46 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 22 21:12:46 localhost systemd[1]: Stopped target Network.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Timer Units.
Jan 22 21:12:46 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 22 21:12:46 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Basic System.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Path Units.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Remote File Systems.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Slice Units.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Socket Units.
Jan 22 21:12:46 localhost systemd[1]: Stopped target System Initialization.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Local File Systems.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Swaps.
Jan 22 21:12:46 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut mount hook.
Jan 22 21:12:46 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 22 21:12:46 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 22 21:12:46 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 22 21:12:46 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 22 21:12:46 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 22 21:12:46 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 22 21:12:46 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 22 21:12:46 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 22 21:12:46 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 22 21:12:46 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 22 21:12:46 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 22 21:12:46 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 22 21:12:46 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Closed udev Control Socket.
Jan 22 21:12:46 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Closed udev Kernel Socket.
Jan 22 21:12:46 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 22 21:12:46 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 22 21:12:46 localhost systemd[1]: Starting Cleanup udev Database...
Jan 22 21:12:46 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 22 21:12:46 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 22 21:12:46 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Stopped Create System Users.
Jan 22 21:12:46 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 22 21:12:46 localhost systemd[1]: Finished Cleanup udev Database.
Jan 22 21:12:46 localhost systemd[1]: Reached target Switch Root.
Jan 22 21:12:46 localhost systemd[1]: Starting Switch Root...
Jan 22 21:12:46 localhost systemd[1]: Switching root.
Jan 22 21:12:46 localhost systemd-journald[308]: Journal stopped
Jan 22 21:12:47 localhost systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Jan 22 21:12:47 localhost kernel: audit: type=1404 audit(1769116366.606:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability open_perms=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:12:47 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:12:47 localhost kernel: audit: type=1403 audit(1769116366.749:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 22 21:12:47 localhost systemd[1]: Successfully loaded SELinux policy in 146.614ms.
Jan 22 21:12:47 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.988ms.
Jan 22 21:12:47 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 21:12:47 localhost systemd[1]: Detected virtualization kvm.
Jan 22 21:12:47 localhost systemd[1]: Detected architecture x86-64.
Jan 22 21:12:47 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:12:47 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Stopped Switch Root.
Jan 22 21:12:47 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 22 21:12:47 localhost systemd[1]: Created slice Slice /system/getty.
Jan 22 21:12:47 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 22 21:12:47 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 22 21:12:47 localhost systemd[1]: Created slice User and Session Slice.
Jan 22 21:12:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 21:12:47 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 22 21:12:47 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 22 21:12:47 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 22 21:12:47 localhost systemd[1]: Stopped target Switch Root.
Jan 22 21:12:47 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 22 21:12:47 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 22 21:12:47 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 22 21:12:47 localhost systemd[1]: Reached target Path Units.
Jan 22 21:12:47 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 22 21:12:47 localhost systemd[1]: Reached target Slice Units.
Jan 22 21:12:47 localhost systemd[1]: Reached target Swaps.
Jan 22 21:12:47 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 22 21:12:47 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 22 21:12:47 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 22 21:12:47 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 22 21:12:47 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 22 21:12:47 localhost systemd[1]: Listening on udev Control Socket.
Jan 22 21:12:47 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 22 21:12:47 localhost systemd[1]: Mounting Huge Pages File System...
Jan 22 21:12:47 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 22 21:12:47 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 22 21:12:47 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 22 21:12:47 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 21:12:47 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 22 21:12:47 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 21:12:47 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 22 21:12:47 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 22 21:12:47 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 22 21:12:47 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 22 21:12:47 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 22 21:12:47 localhost systemd[1]: Stopped Journal Service.
Jan 22 21:12:47 localhost kernel: fuse: init (API version 7.37)
Jan 22 21:12:47 localhost systemd[1]: Starting Journal Service...
Jan 22 21:12:47 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 21:12:47 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 22 21:12:47 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 21:12:47 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 22 21:12:47 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 22 21:12:47 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 22 21:12:47 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 22 21:12:47 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 22 21:12:47 localhost systemd[1]: Mounted Huge Pages File System.
Jan 22 21:12:47 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 22 21:12:47 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 22 21:12:47 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 22 21:12:47 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 22 21:12:47 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 21:12:47 localhost systemd-journald[679]: Journal started
Jan 22 21:12:47 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 21:12:47 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 22 21:12:47 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 22 21:12:47 localhost systemd[1]: Started Journal Service.
Jan 22 21:12:47 localhost kernel: ACPI: bus type drm_connector registered
Jan 22 21:12:47 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 22 21:12:47 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 22 21:12:47 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 22 21:12:47 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 22 21:12:47 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 22 21:12:47 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 22 21:12:47 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 22 21:12:47 localhost systemd[1]: Mounting FUSE Control File System...
Jan 22 21:12:47 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 21:12:47 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 22 21:12:47 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 22 21:12:47 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 22 21:12:47 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 22 21:12:47 localhost systemd[1]: Starting Create System Users...
Jan 22 21:12:47 localhost systemd[1]: Mounted FUSE Control File System.
Jan 22 21:12:47 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 21:12:47 localhost systemd-journald[679]: Received client request to flush runtime journal.
Jan 22 21:12:47 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 22 21:12:47 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 22 21:12:47 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 21:12:47 localhost systemd[1]: Finished Create System Users.
Jan 22 21:12:47 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 21:12:47 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 22 21:12:47 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 21:12:47 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 22 21:12:47 localhost systemd[1]: Reached target Local File Systems.
Jan 22 21:12:47 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 22 21:12:47 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 22 21:12:47 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 22 21:12:47 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 22 21:12:47 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 22 21:12:47 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 22 21:12:47 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 21:12:47 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Jan 22 21:12:47 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 22 21:12:47 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 21:12:47 localhost systemd[1]: Starting Security Auditing Service...
Jan 22 21:12:47 localhost systemd[1]: Starting RPC Bind...
Jan 22 21:12:47 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 22 21:12:47 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 22 21:12:47 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 22 21:12:47 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 22 21:12:47 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 22 21:12:47 localhost systemd[1]: Started RPC Bind.
Jan 22 21:12:47 localhost augenrules[710]: /sbin/augenrules: No change
Jan 22 21:12:47 localhost augenrules[725]: No rules
Jan 22 21:12:47 localhost augenrules[725]: enabled 1
Jan 22 21:12:47 localhost augenrules[725]: failure 1
Jan 22 21:12:47 localhost augenrules[725]: pid 705
Jan 22 21:12:47 localhost augenrules[725]: rate_limit 0
Jan 22 21:12:47 localhost augenrules[725]: backlog_limit 8192
Jan 22 21:12:47 localhost augenrules[725]: lost 0
Jan 22 21:12:47 localhost augenrules[725]: backlog 4
Jan 22 21:12:47 localhost augenrules[725]: backlog_wait_time 60000
Jan 22 21:12:47 localhost augenrules[725]: backlog_wait_time_actual 0
Jan 22 21:12:47 localhost systemd[1]: Started Security Auditing Service.
Jan 22 21:12:47 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 22 21:12:47 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 22 21:12:47 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 22 21:12:47 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 21:12:47 localhost systemd[1]: Starting Update is Completed...
Jan 22 21:12:47 localhost systemd[1]: Finished Update is Completed.
Jan 22 21:12:47 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 21:12:47 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 21:12:47 localhost systemd[1]: Reached target System Initialization.
Jan 22 21:12:47 localhost systemd[1]: Started dnf makecache --timer.
Jan 22 21:12:48 localhost systemd[1]: Started Daily rotation of log files.
Jan 22 21:12:48 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 22 21:12:48 localhost systemd[1]: Reached target Timer Units.
Jan 22 21:12:48 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 21:12:48 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 22 21:12:48 localhost systemd[1]: Reached target Socket Units.
Jan 22 21:12:48 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 22 21:12:48 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 21:12:48 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 22 21:12:48 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 22 21:12:48 localhost systemd-udevd[739]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:12:48 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 21:12:48 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 22 21:12:48 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 22 21:12:48 localhost systemd[1]: Reached target Basic System.
Jan 22 21:12:48 localhost dbus-broker-lau[766]: Ready
Jan 22 21:12:48 localhost systemd[1]: Starting NTP client/server...
Jan 22 21:12:48 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 22 21:12:48 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 22 21:12:48 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 22 21:12:48 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 22 21:12:48 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 22 21:12:48 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 22 21:12:48 localhost chronyd[786]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 21:12:48 localhost chronyd[786]: Loaded 0 symmetric keys
Jan 22 21:12:48 localhost chronyd[786]: Using right/UTC timezone to obtain leap second data
Jan 22 21:12:48 localhost chronyd[786]: Loaded seccomp filter (level 2)
Jan 22 21:12:48 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 22 21:12:48 localhost systemd[1]: Started irqbalance daemon.
Jan 22 21:12:48 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 22 21:12:48 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 21:12:48 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 21:12:48 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 21:12:48 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 22 21:12:48 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 22 21:12:48 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 22 21:12:48 localhost systemd[1]: Starting User Login Management...
Jan 22 21:12:48 localhost systemd[1]: Started NTP client/server.
Jan 22 21:12:48 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 22 21:12:48 localhost kernel: kvm_amd: TSC scaling supported
Jan 22 21:12:48 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 22 21:12:48 localhost kernel: kvm_amd: Nested Paging enabled
Jan 22 21:12:48 localhost kernel: kvm_amd: LBR virtualization supported
Jan 22 21:12:48 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 22 21:12:48 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 22 21:12:48 localhost kernel: Console: switching to colour dummy device 80x25
Jan 22 21:12:48 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 22 21:12:48 localhost kernel: [drm] features: -context_init
Jan 22 21:12:48 localhost kernel: [drm] number of scanouts: 1
Jan 22 21:12:48 localhost kernel: [drm] number of cap sets: 0
Jan 22 21:12:48 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 22 21:12:48 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 22 21:12:48 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 22 21:12:48 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 22 21:12:48 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 22 21:12:48 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 22 21:12:48 localhost systemd-logind[801]: New seat seat0.
Jan 22 21:12:48 localhost systemd-logind[801]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 21:12:48 localhost systemd-logind[801]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 21:12:48 localhost systemd[1]: Started User Login Management.
Jan 22 21:12:48 localhost iptables.init[788]: iptables: Applying firewall rules: [  OK  ]
Jan 22 21:12:48 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 22 21:12:48 localhost cloud-init[843]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 22 Jan 2026 21:12:48 +0000. Up 6.23 seconds.
Jan 22 21:12:48 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 22 21:12:48 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 22 21:12:48 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpgon1mavg.mount: Deactivated successfully.
Jan 22 21:12:48 localhost systemd[1]: Starting Hostname Service...
Jan 22 21:12:48 localhost systemd[1]: Started Hostname Service.
Jan 22 21:12:48 np0005592765.novalocal systemd-hostnamed[857]: Hostname set to <np0005592765.novalocal> (static)
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Reached target Preparation for Network.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Starting Network Manager...
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1460] NetworkManager (version 1.54.3-2.el9) is starting... (boot:7bdd0997-5020-422e-9e39-85d77ba7ab4a)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1465] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1547] manager[0x561b969be000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1589] hostname: hostname: using hostnamed
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1589] hostname: static hostname changed from (none) to "np0005592765.novalocal"
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1594] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1705] manager[0x561b969be000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1706] manager[0x561b969be000]: rfkill: WWAN hardware radio set enabled
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1789] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1791] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1792] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1792] manager: Networking is enabled by state file
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1794] settings: Loaded settings plugin: keyfile (internal)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1804] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1830] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1842] dhcp: init: Using DHCP client 'internal'
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1845] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1894] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1903] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1912] device (lo): Activation: starting connection 'lo' (b467b8bc-34e9-40e6-be6e-e31b90d2564c)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1922] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1925] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Started Network Manager.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1971] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1984] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1987] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1990] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.1993] device (eth0): carrier: link connected
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Reached target Network.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2016] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2025] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2033] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2039] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2039] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2043] manager: NetworkManager state is now CONNECTING
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2044] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2071] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2074] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2134] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2143] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2167] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2173] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2175] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2180] device (lo): Activation: successful, device activated.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2195] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2197] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2200] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2203] device (eth0): Activation: successful, device activated.
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2211] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 21:12:49 np0005592765.novalocal NetworkManager[861]: <info>  [1769116369.2215] manager: startup complete
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Reached target NFS client services.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Reached target Remote File Systems.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 22 21:12:49 np0005592765.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 22 Jan 2026 21:12:49 +0000. Up 7.15 seconds.
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |  eth0  | True |         38.102.83.50         | 255.255.255.0 | global | fa:16:3e:3d:88:74 |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |  eth0  | True | fe80::f816:3eff:fe3d:8874/64 |       .       |  link  | fa:16:3e:3d:88:74 |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 22 21:12:49 np0005592765.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: new group: name=cloud-user, GID=1001
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: add 'cloud-user' to group 'adm'
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: add 'cloud-user' to group 'systemd-journal'
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: add 'cloud-user' to shadow group 'adm'
Jan 22 21:12:50 np0005592765.novalocal useradd[991]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Generating public/private rsa key pair.
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key fingerprint is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: SHA256:ARUv8Twbc+dr/szaX6omyajY4MDbodPlDtHfF2ay9XQ root@np0005592765.novalocal
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key's randomart image is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +---[RSA 3072]----+
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |      ..+.       |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       . =       |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |        o B . .  |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |     .   o * o   |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |    . . S o = o E|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: | .   ... . * + o |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |  o.+o  .oo.. + .|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |  .*.*. . +..o =.|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |  o.+.+.   o..+oB|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +----[SHA256]-----+
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Generating public/private ecdsa key pair.
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key fingerprint is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: SHA256:fYPhn+G8gDTm4xLHkpNFFR4ezwVO5Mm3fEoH1lOkQNg root@np0005592765.novalocal
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key's randomart image is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +---[ECDSA 256]---+
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |         .=**...o|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |        .o.XE+ o.|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       .  + B =..|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |        .o o + o.|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       =S + + + o|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |      *+oo = = + |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       =+ . = .  |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |      .. . . .   |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       ..   .    |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +----[SHA256]-----+
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Generating public/private ed25519 key pair.
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key fingerprint is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: SHA256:Wde3sc8ikTswMFqSntIW0Zzwmp1CNxOjEWX2djOcqBA root@np0005592765.novalocal
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: The key's randomart image is:
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +--[ED25519 256]--+
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |      E**.       |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |       X++ o o   |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |      B O = B ...|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |     + % @ o + .+|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |    . X S o o  o |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |     o .   o o ..|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |            + . o|
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |             o . |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: |                 |
Jan 22 21:12:50 np0005592765.novalocal cloud-init[925]: +----[SHA256]-----+
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Reached target Network is Online.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting System Logging Service...
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 22 21:12:50 np0005592765.novalocal sm-notify[1007]: Version 2.5.4 starting
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Starting Permit User Sessions...
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 22 21:12:50 np0005592765.novalocal sshd[1009]: Server listening on 0.0.0.0 port 22.
Jan 22 21:12:50 np0005592765.novalocal sshd[1009]: Server listening on :: port 22.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Finished Permit User Sessions.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started Command Scheduler.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started Getty on tty1.
Jan 22 21:12:50 np0005592765.novalocal rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Jan 22 21:12:50 np0005592765.novalocal rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 22 21:12:50 np0005592765.novalocal crond[1012]: (CRON) STARTUP (1.5.7)
Jan 22 21:12:50 np0005592765.novalocal crond[1012]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 22 21:12:50 np0005592765.novalocal crond[1012]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 77% if used.)
Jan 22 21:12:50 np0005592765.novalocal crond[1012]: (CRON) INFO (running with inotify support)
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Reached target Login Prompts.
Jan 22 21:12:50 np0005592765.novalocal systemd[1]: Started System Logging Service.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Reached target Multi-User System.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 22 21:12:51 np0005592765.novalocal rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 21:12:51 np0005592765.novalocal kdumpctl[1020]: kdump: No kdump initial ramdisk found.
Jan 22 21:12:51 np0005592765.novalocal kdumpctl[1020]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1126]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 22 Jan 2026 21:12:51 +0000. Up 8.84 seconds.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 22 21:12:51 np0005592765.novalocal dracut[1269]: dracut-057-102.git20250818.el9
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1287]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 22 Jan 2026 21:12:51 +0000. Up 9.24 seconds.
Jan 22 21:12:51 np0005592765.novalocal dracut[1271]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1309]: #############################################################
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1313]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1312]: Unable to negotiate with 38.102.83.114 port 48808: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1319]: 256 SHA256:fYPhn+G8gDTm4xLHkpNFFR4ezwVO5Mm3fEoH1lOkQNg root@np0005592765.novalocal (ECDSA)
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1326]: 256 SHA256:Wde3sc8ikTswMFqSntIW0Zzwmp1CNxOjEWX2djOcqBA root@np0005592765.novalocal (ED25519)
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1332]: 3072 SHA256:ARUv8Twbc+dr/szaX6omyajY4MDbodPlDtHfF2ay9XQ root@np0005592765.novalocal (RSA)
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1324]: Connection reset by 38.102.83.114 port 48812 [preauth]
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1340]: #############################################################
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1335]: Unable to negotiate with 38.102.83.114 port 48826: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1353]: Unable to negotiate with 38.102.83.114 port 48838: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 22 21:12:51 np0005592765.novalocal cloud-init[1287]: Cloud-init v. 24.4-8.el9 finished at Thu, 22 Jan 2026 21:12:51 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.41 seconds
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1356]: Connection reset by 38.102.83.114 port 48848 [preauth]
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1297]: Connection closed by 38.102.83.114 port 48792 [preauth]
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1364]: Connection closed by 38.102.83.114 port 48850 [preauth]
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1368]: Unable to negotiate with 38.102.83.114 port 48858: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 22 21:12:51 np0005592765.novalocal sshd-session[1370]: Unable to negotiate with 38.102.83.114 port 48868: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 22 21:12:51 np0005592765.novalocal systemd[1]: Reached target Cloud-init target.
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: memstrack is not available
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 21:12:52 np0005592765.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: memstrack is not available
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: *** Including module: systemd ***
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: *** Including module: fips ***
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: *** Including module: systemd-initrd ***
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: *** Including module: i18n ***
Jan 22 21:12:53 np0005592765.novalocal dracut[1271]: *** Including module: drm ***
Jan 22 21:12:54 np0005592765.novalocal chronyd[786]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 22 21:12:54 np0005592765.novalocal chronyd[786]: System clock TAI offset set to 37 seconds
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: prefixdevname ***
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: kernel-modules ***
Jan 22 21:12:54 np0005592765.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: kernel-modules-extra ***
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: qemu ***
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: fstab-sys ***
Jan 22 21:12:54 np0005592765.novalocal dracut[1271]: *** Including module: rootfs-block ***
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: *** Including module: terminfo ***
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: *** Including module: udev-rules ***
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: Skipping udev rule: 91-permissions.rules
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: *** Including module: virtiofs ***
Jan 22 21:12:55 np0005592765.novalocal dracut[1271]: *** Including module: dracut-systemd ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]: *** Including module: usrmount ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]: *** Including module: base ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]: *** Including module: fs-lib ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]: *** Including module: kdumpbase ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:   microcode_ctl module: mangling fw_dir
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 22 21:12:56 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]: *** Including module: openssl ***
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]: *** Including module: shutdown ***
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]: *** Including module: squash ***
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]: *** Including modules done ***
Jan 22 21:12:57 np0005592765.novalocal dracut[1271]: *** Installing kernel module dependencies ***
Jan 22 21:12:58 np0005592765.novalocal dracut[1271]: *** Installing kernel module dependencies done ***
Jan 22 21:12:58 np0005592765.novalocal dracut[1271]: *** Resolving executable dependencies ***
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 35 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 33 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 31 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 28 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 34 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 32 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 30 affinity is now unmanaged
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 22 21:12:58 np0005592765.novalocal irqbalance[796]: IRQ 29 affinity is now unmanaged
Jan 22 21:12:59 np0005592765.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:12:59 np0005592765.novalocal dracut[1271]: *** Resolving executable dependencies done ***
Jan 22 21:12:59 np0005592765.novalocal dracut[1271]: *** Generating early-microcode cpio image ***
Jan 22 21:12:59 np0005592765.novalocal dracut[1271]: *** Store current command line parameters ***
Jan 22 21:12:59 np0005592765.novalocal dracut[1271]: Stored kernel commandline:
Jan 22 21:12:59 np0005592765.novalocal dracut[1271]: No dracut internal kernel commandline stored in the initramfs
Jan 22 21:13:00 np0005592765.novalocal dracut[1271]: *** Install squash loader ***
Jan 22 21:13:01 np0005592765.novalocal dracut[1271]: *** Squashing the files inside the initramfs ***
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: *** Squashing the files inside the initramfs done ***
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: *** Hardlinking files ***
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Mode:           real
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Files:          50
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Linked:         0 files
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Compared:       0 xattrs
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Compared:       0 files
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Saved:          0 B
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: Duration:       0.001045 seconds
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: *** Hardlinking files done ***
Jan 22 21:13:02 np0005592765.novalocal dracut[1271]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 22 21:13:03 np0005592765.novalocal kdumpctl[1020]: kdump: kexec: loaded kdump kernel
Jan 22 21:13:03 np0005592765.novalocal kdumpctl[1020]: kdump: Starting kdump: [OK]
Jan 22 21:13:03 np0005592765.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 22 21:13:03 np0005592765.novalocal systemd[1]: Startup finished in 1.632s (kernel) + 2.627s (initrd) + 16.935s (userspace) = 21.195s.
Jan 22 21:13:19 np0005592765.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 21:13:50 np0005592765.novalocal sshd-session[4307]: Accepted publickey for zuul from 38.102.83.114 port 52796 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 22 21:13:50 np0005592765.novalocal systemd-logind[801]: New session 1 of user zuul.
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Queued start job for default target Main User Target.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Created slice User Application Slice.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Reached target Paths.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Reached target Timers.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Starting D-Bus User Message Bus Socket...
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Starting Create User's Volatile Files and Directories...
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Listening on D-Bus User Message Bus Socket.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Reached target Sockets.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Finished Create User's Volatile Files and Directories.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Reached target Basic System.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Reached target Main User Target.
Jan 22 21:13:50 np0005592765.novalocal systemd[4311]: Startup finished in 130ms.
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 22 21:13:50 np0005592765.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 22 21:13:50 np0005592765.novalocal sshd-session[4307]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:13:51 np0005592765.novalocal python3[4393]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:14:00 np0005592765.novalocal chronyd[786]: Selected source 206.108.0.132 (2.centos.pool.ntp.org)
Jan 22 21:14:01 np0005592765.novalocal python3[4421]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:14:10 np0005592765.novalocal python3[4479]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:14:11 np0005592765.novalocal python3[4519]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 22 21:14:14 np0005592765.novalocal python3[4545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMeBm6NiIvVxetJ4FCT6X6eqh4y+XGlxa2O9SnCM87Rvd3hWwicZru82vfPJxoAbseaSfdfZfa5Oaf5dhIru0B1DVPR+Y21uBaSUcO1K8p5tC2tzE5lkAIy/kRVSwMYfC5dEpExJw+20uHDLhVtsOAMKhThwm/XsS/9yy8cQG4ADfn2gl1nOfLWdeDsMspTuwYbF5uu9ANML8AymvhI9P057RIvVPP3XADkxcthWmeoY31Rv8JlJGXn9R9yr9bXjaXt1WnmADIMvCooPBtjoHFkzec9uGiq2KbPxYijX4nkBK7VCl+z7mv0qda4ub0iuJwaz74mccey9rlhgqsbW68VK8P5ok/O5AYo7MrOUCGbNrU9JgXrMTk2Iu7TMLxDuT0VdEs8Q2UG15+ASQiyG6zYkOCJ02VjwHLQyQ73PJXkt2gFQHX1iBFOvYo2QMz4/MD4kAU/TCfKhXngyzI4H7PhTJJ3yrwNrOT4XzOSSfMhvBlszNp33r0FR4w0Oh/ssE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:14 np0005592765.novalocal python3[4569]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:14 np0005592765.novalocal python3[4668]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:15 np0005592765.novalocal python3[4739]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116454.601803-251-69242112548370/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=e08bc08cd036420dba87014ca8d05e85_id_rsa follow=False checksum=f2ae46c25e13ca69ac26d9c69f9985b01dd0fed0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:15 np0005592765.novalocal python3[4862]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:16 np0005592765.novalocal python3[4933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116455.5862935-306-65148852671223/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=e08bc08cd036420dba87014ca8d05e85_id_rsa.pub follow=False checksum=9abcadb7ea942e24eaeef4fb6992f6baca8c0b83 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:17 np0005592765.novalocal python3[4981]: ansible-ping Invoked with data=pong
Jan 22 21:14:18 np0005592765.novalocal python3[5005]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:14:20 np0005592765.novalocal python3[5063]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 22 21:14:21 np0005592765.novalocal python3[5095]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:21 np0005592765.novalocal python3[5119]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:22 np0005592765.novalocal python3[5143]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:22 np0005592765.novalocal python3[5167]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:22 np0005592765.novalocal python3[5191]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:22 np0005592765.novalocal python3[5215]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:24 np0005592765.novalocal sudo[5239]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmhijlrpkdmyvkpjsismlvxifeeabhsb ; /usr/bin/python3'
Jan 22 21:14:24 np0005592765.novalocal sudo[5239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:24 np0005592765.novalocal python3[5241]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:24 np0005592765.novalocal sudo[5239]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:24 np0005592765.novalocal sudo[5317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnshzyaxhgciuruysasuzuvmkdnokmo ; /usr/bin/python3'
Jan 22 21:14:24 np0005592765.novalocal sudo[5317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:25 np0005592765.novalocal python3[5319]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:25 np0005592765.novalocal sudo[5317]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:25 np0005592765.novalocal sudo[5390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqzwzjxdpbzcdxyxkyqvjsoolnzogkaq ; /usr/bin/python3'
Jan 22 21:14:25 np0005592765.novalocal sudo[5390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:25 np0005592765.novalocal python3[5392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116464.66633-31-103923903660807/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:25 np0005592765.novalocal sudo[5390]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:26 np0005592765.novalocal python3[5440]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:26 np0005592765.novalocal python3[5464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:26 np0005592765.novalocal python3[5488]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:27 np0005592765.novalocal python3[5512]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:27 np0005592765.novalocal python3[5536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:27 np0005592765.novalocal python3[5560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:28 np0005592765.novalocal python3[5584]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:28 np0005592765.novalocal python3[5608]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:28 np0005592765.novalocal python3[5632]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:28 np0005592765.novalocal python3[5656]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:29 np0005592765.novalocal python3[5680]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:29 np0005592765.novalocal python3[5704]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:29 np0005592765.novalocal python3[5728]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:30 np0005592765.novalocal python3[5752]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:30 np0005592765.novalocal python3[5776]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:30 np0005592765.novalocal python3[5800]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:30 np0005592765.novalocal python3[5824]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:31 np0005592765.novalocal python3[5848]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:31 np0005592765.novalocal python3[5872]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:31 np0005592765.novalocal python3[5896]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:32 np0005592765.novalocal python3[5920]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:32 np0005592765.novalocal python3[5944]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:32 np0005592765.novalocal python3[5968]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:33 np0005592765.novalocal python3[5992]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:33 np0005592765.novalocal python3[6016]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:33 np0005592765.novalocal python3[6040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:14:35 np0005592765.novalocal sudo[6064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnqzphugnafyevpnntqwdjwokoflgrru ; /usr/bin/python3'
Jan 22 21:14:35 np0005592765.novalocal sudo[6064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:36 np0005592765.novalocal python3[6066]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 21:14:36 np0005592765.novalocal systemd[1]: Starting Time & Date Service...
Jan 22 21:14:36 np0005592765.novalocal systemd[1]: Started Time & Date Service.
Jan 22 21:14:36 np0005592765.novalocal systemd-timedated[6068]: Changed time zone to 'UTC' (UTC).
Jan 22 21:14:36 np0005592765.novalocal sudo[6064]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:36 np0005592765.novalocal sudo[6095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avnsmjwriunidaoxqnufdlqyppxkceyk ; /usr/bin/python3'
Jan 22 21:14:36 np0005592765.novalocal sudo[6095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:36 np0005592765.novalocal python3[6097]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:36 np0005592765.novalocal sudo[6095]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:37 np0005592765.novalocal python3[6173]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:37 np0005592765.novalocal python3[6244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769116476.9011865-251-210301106189644/source _original_basename=tmpazye9zow follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:38 np0005592765.novalocal python3[6344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:38 np0005592765.novalocal python3[6415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769116477.816407-301-70414090755490/source _original_basename=tmpcos4hobn follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:39 np0005592765.novalocal sudo[6515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiraqpiaqyhmwmfrotjieqcvobqewnor ; /usr/bin/python3'
Jan 22 21:14:39 np0005592765.novalocal sudo[6515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:39 np0005592765.novalocal python3[6517]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:39 np0005592765.novalocal sudo[6515]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:39 np0005592765.novalocal sudo[6588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfnjudcwxkddcafdtrswgrojxoeqlnxh ; /usr/bin/python3'
Jan 22 21:14:39 np0005592765.novalocal sudo[6588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:39 np0005592765.novalocal python3[6590]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769116478.9556139-381-211501995843142/source _original_basename=tmprxty4681 follow=False checksum=9dc2039529c0f35ddba9b5f747501467f5135778 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:39 np0005592765.novalocal sudo[6588]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:40 np0005592765.novalocal python3[6638]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:14:40 np0005592765.novalocal python3[6664]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:14:40 np0005592765.novalocal sudo[6742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aulidggtngmwewykfbjyxdukxzemlltl ; /usr/bin/python3'
Jan 22 21:14:40 np0005592765.novalocal sudo[6742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:41 np0005592765.novalocal python3[6744]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:14:41 np0005592765.novalocal sudo[6742]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:41 np0005592765.novalocal sudo[6815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyvqbvjayboeldwtjckhevdnbgayphdu ; /usr/bin/python3'
Jan 22 21:14:41 np0005592765.novalocal sudo[6815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:41 np0005592765.novalocal python3[6817]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116480.710287-451-10451980186114/source _original_basename=tmplqp7v26i follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:14:41 np0005592765.novalocal sudo[6815]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:41 np0005592765.novalocal sudo[6866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgzcyjzsiknhzorqrcgmuncuelefgvsb ; /usr/bin/python3'
Jan 22 21:14:41 np0005592765.novalocal sudo[6866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:14:41 np0005592765.novalocal python3[6868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-a621-2aaa-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:14:41 np0005592765.novalocal sudo[6866]: pam_unix(sudo:session): session closed for user root
Jan 22 21:14:42 np0005592765.novalocal python3[6896]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-a621-2aaa-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 22 21:14:43 np0005592765.novalocal python3[6924]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:15:02 np0005592765.novalocal sudo[6948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srguwysfkjnnrrnsphoxaqswngtpwpdt ; /usr/bin/python3'
Jan 22 21:15:02 np0005592765.novalocal sudo[6948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:15:02 np0005592765.novalocal python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:15:02 np0005592765.novalocal sudo[6948]: pam_unix(sudo:session): session closed for user root
Jan 22 21:15:06 np0005592765.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 22 21:15:42 np0005592765.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 22 21:15:42 np0005592765.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9325] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 21:15:42 np0005592765.novalocal systemd-udevd[6953]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9525] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9555] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9558] device (eth1): carrier: link connected
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9560] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9566] policy: auto-activating connection 'Wired connection 1' (eb7d8611-2061-33cb-9530-bc24d43f8bde)
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9571] device (eth1): Activation: starting connection 'Wired connection 1' (eb7d8611-2061-33cb-9530-bc24d43f8bde)
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9573] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9576] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9581] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:15:42 np0005592765.novalocal NetworkManager[861]: <info>  [1769116542.9586] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:15:43 np0005592765.novalocal python3[6980]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-9ef2-05fc-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:15:50 np0005592765.novalocal sudo[7058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aonlykoymeemtpwzjhfdfjxjdtziudzf ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 21:15:50 np0005592765.novalocal sudo[7058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:15:50 np0005592765.novalocal python3[7060]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:15:50 np0005592765.novalocal sudo[7058]: pam_unix(sudo:session): session closed for user root
Jan 22 21:15:51 np0005592765.novalocal sudo[7131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skvvsyiilaguopnohnnofrypzlwrxoip ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 21:15:51 np0005592765.novalocal sudo[7131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:15:51 np0005592765.novalocal python3[7133]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116550.6437793-104-5595430318953/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=a5efeed4f72f1e825f74ea350b455ca7f0b569d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:15:51 np0005592765.novalocal sudo[7131]: pam_unix(sudo:session): session closed for user root
Jan 22 21:15:51 np0005592765.novalocal sudo[7181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrwegdyyxohnuljitebhqfjvuptlirmc ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 21:15:51 np0005592765.novalocal sudo[7181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:15:52 np0005592765.novalocal python3[7183]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Stopping Network Manager...
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2264] caught SIGTERM, shutting down normally.
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2273] dhcp4 (eth0): canceled DHCP transaction
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2273] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2274] dhcp4 (eth0): state changed no lease
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2276] manager: NetworkManager state is now CONNECTING
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2378] dhcp4 (eth1): canceled DHCP transaction
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2378] dhcp4 (eth1): state changed no lease
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[861]: <info>  [1769116552.2448] exiting (success)
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Stopped Network Manager.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: NetworkManager.service: Consumed 1.476s CPU time, 10.0M memory peak.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Starting Network Manager...
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.2997] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:7bdd0997-5020-422e-9e39-85d77ba7ab4a)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.3006] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.3061] manager[0x55d1c2a1e000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Starting Hostname Service...
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Started Hostname Service.
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4250] hostname: hostname: using hostnamed
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4253] hostname: static hostname changed from (none) to "np0005592765.novalocal"
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4261] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4271] manager[0x55d1c2a1e000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4272] manager[0x55d1c2a1e000]: rfkill: WWAN hardware radio set enabled
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4323] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4323] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4324] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4324] manager: Networking is enabled by state file
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4329] settings: Loaded settings plugin: keyfile (internal)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4334] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4376] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4393] dhcp: init: Using DHCP client 'internal'
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4398] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4406] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4415] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4427] device (lo): Activation: starting connection 'lo' (b467b8bc-34e9-40e6-be6e-e31b90d2564c)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4442] device (eth0): carrier: link connected
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4450] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4458] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4458] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4469] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4480] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4492] device (eth1): carrier: link connected
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4498] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4507] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (eb7d8611-2061-33cb-9530-bc24d43f8bde) (indicated)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4507] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4516] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4529] device (eth1): Activation: starting connection 'Wired connection 1' (eb7d8611-2061-33cb-9530-bc24d43f8bde)
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Started Network Manager.
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4539] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4546] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4549] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4552] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4554] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4569] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4572] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4576] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4590] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4598] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4601] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4612] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4616] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4636] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4642] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4647] device (lo): Activation: successful, device activated.
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4673] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4681] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 21:15:52 np0005592765.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4758] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4812] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4814] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4818] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4821] device (eth0): Activation: successful, device activated.
Jan 22 21:15:52 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116552.4829] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 21:15:52 np0005592765.novalocal sudo[7181]: pam_unix(sudo:session): session closed for user root
Jan 22 21:15:52 np0005592765.novalocal python3[7267]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-9ef2-05fc-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:16:02 np0005592765.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:16:22 np0005592765.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3440] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 21:16:37 np0005592765.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:16:37 np0005592765.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3717] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3720] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3732] device (eth1): Activation: successful, device activated.
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3742] manager: startup complete
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3745] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <warn>  [1769116597.3755] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3766] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3908] dhcp4 (eth1): canceled DHCP transaction
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3909] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3909] dhcp4 (eth1): state changed no lease
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3926] policy: auto-activating connection 'ci-private-network' (8e0b80b2-e493-5867-80c6-016d3ab3d59e)
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3932] device (eth1): Activation: starting connection 'ci-private-network' (8e0b80b2-e493-5867-80c6-016d3ab3d59e)
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3933] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3940] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3948] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.3957] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.4015] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.4018] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:16:37 np0005592765.novalocal NetworkManager[7195]: <info>  [1769116597.4027] device (eth1): Activation: successful, device activated.
Jan 22 21:16:45 np0005592765.novalocal systemd[4311]: Starting Mark boot as successful...
Jan 22 21:16:45 np0005592765.novalocal systemd[4311]: Finished Mark boot as successful.
Jan 22 21:16:47 np0005592765.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:16:52 np0005592765.novalocal sshd-session[4320]: Received disconnect from 38.102.83.114 port 52796:11: disconnected by user
Jan 22 21:16:52 np0005592765.novalocal sshd-session[4320]: Disconnected from user zuul 38.102.83.114 port 52796
Jan 22 21:16:52 np0005592765.novalocal sshd-session[4307]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:16:52 np0005592765.novalocal systemd-logind[801]: Session 1 logged out. Waiting for processes to exit.
Jan 22 21:17:45 np0005592765.novalocal sshd-session[7296]: Accepted publickey for zuul from 38.102.83.114 port 48632 ssh2: RSA SHA256:5hPaKzzgux2WRINR8nUX2B9zeLH3Zuh4lEFbJIE1uhE
Jan 22 21:17:45 np0005592765.novalocal systemd-logind[801]: New session 3 of user zuul.
Jan 22 21:17:45 np0005592765.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 22 21:17:45 np0005592765.novalocal sshd-session[7296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:17:45 np0005592765.novalocal sudo[7375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkhjmbaxdghbfgfatmmeznlxsahxdcu ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 21:17:45 np0005592765.novalocal sudo[7375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:17:45 np0005592765.novalocal python3[7377]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:17:45 np0005592765.novalocal sudo[7375]: pam_unix(sudo:session): session closed for user root
Jan 22 21:17:46 np0005592765.novalocal sudo[7448]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywzbbnuhzabzzkrgbhxkjgwpbrssombf ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 22 21:17:46 np0005592765.novalocal sudo[7448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:17:46 np0005592765.novalocal python3[7450]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116665.5345914-365-50116966072737/source _original_basename=tmpvljj1lr1 follow=False checksum=dd64a3b92ffda34adfd43ce03aa34a851854b0c5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:17:46 np0005592765.novalocal sudo[7448]: pam_unix(sudo:session): session closed for user root
Jan 22 21:17:50 np0005592765.novalocal sshd-session[7299]: Connection closed by 38.102.83.114 port 48632
Jan 22 21:17:50 np0005592765.novalocal sshd-session[7296]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:17:50 np0005592765.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 22 21:17:50 np0005592765.novalocal systemd-logind[801]: Session 3 logged out. Waiting for processes to exit.
Jan 22 21:17:50 np0005592765.novalocal systemd-logind[801]: Removed session 3.
Jan 22 21:18:18 np0005592765.novalocal chronyd[786]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 22 21:19:45 np0005592765.novalocal systemd[4311]: Created slice User Background Tasks Slice.
Jan 22 21:19:45 np0005592765.novalocal systemd[4311]: Starting Cleanup of User's Temporary Files and Directories...
Jan 22 21:19:45 np0005592765.novalocal systemd[4311]: Finished Cleanup of User's Temporary Files and Directories.
Jan 22 21:23:50 np0005592765.novalocal sshd-session[7481]: Accepted publickey for zuul from 38.102.83.114 port 37454 ssh2: RSA SHA256:5hPaKzzgux2WRINR8nUX2B9zeLH3Zuh4lEFbJIE1uhE
Jan 22 21:23:50 np0005592765.novalocal systemd-logind[801]: New session 4 of user zuul.
Jan 22 21:23:50 np0005592765.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 22 21:23:50 np0005592765.novalocal sshd-session[7481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:23:50 np0005592765.novalocal sudo[7508]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpfsfrktpiqgylfrfpewifombvuxuekt ; /usr/bin/python3'
Jan 22 21:23:50 np0005592765.novalocal sudo[7508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:50 np0005592765.novalocal python3[7510]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-c9eb-29a0-000000000ca8-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:23:50 np0005592765.novalocal sudo[7508]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:50 np0005592765.novalocal sudo[7536]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojyjxhakwzrmdnvmndcerxvvsluyclzk ; /usr/bin/python3'
Jan 22 21:23:50 np0005592765.novalocal sudo[7536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:50 np0005592765.novalocal python3[7538]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:50 np0005592765.novalocal sudo[7536]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:51 np0005592765.novalocal sudo[7563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogpgothcdikwucmbvtakohbpgzefhiod ; /usr/bin/python3'
Jan 22 21:23:51 np0005592765.novalocal sudo[7563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:51 np0005592765.novalocal python3[7565]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:51 np0005592765.novalocal sudo[7563]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:51 np0005592765.novalocal sudo[7589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwigsehulzelmfckrnynxgcjfnaugno ; /usr/bin/python3'
Jan 22 21:23:51 np0005592765.novalocal sudo[7589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:51 np0005592765.novalocal python3[7591]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:51 np0005592765.novalocal sudo[7589]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:51 np0005592765.novalocal sudo[7615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrbfxbuzubamsrplywsmrddfoibfjbgb ; /usr/bin/python3'
Jan 22 21:23:51 np0005592765.novalocal sudo[7615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:51 np0005592765.novalocal python3[7617]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:51 np0005592765.novalocal sudo[7615]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:52 np0005592765.novalocal sudo[7641]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcsamdhssavaxcnldpsxoeecbjpufifv ; /usr/bin/python3'
Jan 22 21:23:52 np0005592765.novalocal sudo[7641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:52 np0005592765.novalocal python3[7643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:52 np0005592765.novalocal sudo[7641]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:52 np0005592765.novalocal sudo[7719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hktrperhjfxuoizwtcgttmrdupddtngk ; /usr/bin/python3'
Jan 22 21:23:52 np0005592765.novalocal sudo[7719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:52 np0005592765.novalocal python3[7721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:23:52 np0005592765.novalocal sudo[7719]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:53 np0005592765.novalocal sudo[7792]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecdowsqapauochwdbpfeambhcvymiqei ; /usr/bin/python3'
Jan 22 21:23:53 np0005592765.novalocal sudo[7792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:53 np0005592765.novalocal python3[7794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769117032.4746392-366-129925619955968/source _original_basename=tmp2jr8l3dy follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:23:53 np0005592765.novalocal sudo[7792]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:53 np0005592765.novalocal sudo[7842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dusfhwcauwronvrtmystaphdzbmkuwyt ; /usr/bin/python3'
Jan 22 21:23:53 np0005592765.novalocal sudo[7842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:54 np0005592765.novalocal python3[7844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 21:23:54 np0005592765.novalocal systemd[1]: Reloading.
Jan 22 21:23:54 np0005592765.novalocal systemd-rc-local-generator[7866]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:23:54 np0005592765.novalocal sudo[7842]: pam_unix(sudo:session): session closed for user root
Jan 22 21:23:55 np0005592765.novalocal sudo[7899]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hknyhcqsribvbpzpdqhfdxnenijjnlps ; /usr/bin/python3'
Jan 22 21:23:56 np0005592765.novalocal sudo[7899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:23:56 np0005592765.novalocal python3[7901]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 22 21:23:56 np0005592765.novalocal sudo[7899]: pam_unix(sudo:session): session closed for user root
Jan 22 21:24:01 np0005592765.novalocal sudo[7925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxenzcgckesxduvhhuegdfgqnyjabzfp ; /usr/bin/python3'
Jan 22 21:24:01 np0005592765.novalocal sudo[7925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:24:01 np0005592765.novalocal python3[7927]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:24:01 np0005592765.novalocal sudo[7925]: pam_unix(sudo:session): session closed for user root
Jan 22 21:24:01 np0005592765.novalocal sudo[7953]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhluoygmcvmiyrmqqimlewqazphdjmaj ; /usr/bin/python3'
Jan 22 21:24:01 np0005592765.novalocal sudo[7953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:24:01 np0005592765.novalocal python3[7955]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:24:01 np0005592765.novalocal sudo[7953]: pam_unix(sudo:session): session closed for user root
Jan 22 21:24:01 np0005592765.novalocal sudo[7981]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvjtujkgrmmfbpikjbfvzuvcrnbwdnxx ; /usr/bin/python3'
Jan 22 21:24:01 np0005592765.novalocal sudo[7981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:24:02 np0005592765.novalocal python3[7983]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:24:02 np0005592765.novalocal sudo[7981]: pam_unix(sudo:session): session closed for user root
Jan 22 21:24:02 np0005592765.novalocal sudo[8009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lezatgciprfeibdpoiwmndssofznklab ; /usr/bin/python3'
Jan 22 21:24:02 np0005592765.novalocal sudo[8009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:24:02 np0005592765.novalocal python3[8011]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:24:02 np0005592765.novalocal sudo[8009]: pam_unix(sudo:session): session closed for user root
Jan 22 21:24:03 np0005592765.novalocal python3[8038]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-c9eb-29a0-000000000caf-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:24:04 np0005592765.novalocal python3[8068]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 22 21:24:07 np0005592765.novalocal sshd-session[7484]: Connection closed by 38.102.83.114 port 37454
Jan 22 21:24:07 np0005592765.novalocal sshd-session[7481]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:24:07 np0005592765.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 22 21:24:07 np0005592765.novalocal systemd[1]: session-4.scope: Consumed 4.635s CPU time.
Jan 22 21:24:07 np0005592765.novalocal systemd-logind[801]: Session 4 logged out. Waiting for processes to exit.
Jan 22 21:24:07 np0005592765.novalocal systemd-logind[801]: Removed session 4.
Jan 22 21:24:09 np0005592765.novalocal sshd-session[8072]: Accepted publickey for zuul from 38.102.83.114 port 50572 ssh2: RSA SHA256:5hPaKzzgux2WRINR8nUX2B9zeLH3Zuh4lEFbJIE1uhE
Jan 22 21:24:09 np0005592765.novalocal systemd-logind[801]: New session 5 of user zuul.
Jan 22 21:24:09 np0005592765.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 22 21:24:09 np0005592765.novalocal sshd-session[8072]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:24:09 np0005592765.novalocal sudo[8099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobdrzcwdxwpgxnzhqbizqjhlzmtyzke ; /usr/bin/python3'
Jan 22 21:24:09 np0005592765.novalocal sudo[8099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:24:09 np0005592765.novalocal python3[8101]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 22 21:24:16 np0005592765.novalocal setsebool[8144]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 22 21:24:16 np0005592765.novalocal setsebool[8144]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:24:27 np0005592765.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:24:37 np0005592765.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:24:55 np0005592765.novalocal dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 21:24:55 np0005592765.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:24:55 np0005592765.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:24:55 np0005592765.novalocal systemd[1]: Reloading.
Jan 22 21:24:55 np0005592765.novalocal systemd-rc-local-generator[8916]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:24:55 np0005592765.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:24:56 np0005592765.novalocal sudo[8099]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:20 np0005592765.novalocal python3[19929]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ef9-e89a-f0c1-e77d-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:25:21 np0005592765.novalocal kernel: evm: overlay not supported
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: Starting D-Bus User Message Bus...
Jan 22 21:25:21 np0005592765.novalocal dbus-broker-launch[20380]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 22 21:25:21 np0005592765.novalocal dbus-broker-launch[20380]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: Started D-Bus User Message Bus.
Jan 22 21:25:21 np0005592765.novalocal dbus-broker-lau[20380]: Ready
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: Created slice Slice /user.
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: podman-20324.scope: unit configures an IP firewall, but not running as root.
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: (This warning is only shown for the first unit using IP firewalling.)
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: Started podman-20324.scope.
Jan 22 21:25:21 np0005592765.novalocal systemd[4311]: Started podman-pause-7e8dadcf.scope.
Jan 22 21:25:22 np0005592765.novalocal sudo[20689]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgruqlplnhwitdnevuuqwvnyffuznext ; /usr/bin/python3'
Jan 22 21:25:22 np0005592765.novalocal sudo[20689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:22 np0005592765.novalocal python3[20701]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.58:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.58:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:25:22 np0005592765.novalocal python3[20701]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 22 21:25:22 np0005592765.novalocal sudo[20689]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:23 np0005592765.novalocal sshd-session[8075]: Connection closed by 38.102.83.114 port 50572
Jan 22 21:25:23 np0005592765.novalocal sshd-session[8072]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:25:23 np0005592765.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 22 21:25:23 np0005592765.novalocal systemd[1]: session-5.scope: Consumed 43.749s CPU time.
Jan 22 21:25:23 np0005592765.novalocal systemd-logind[801]: Session 5 logged out. Waiting for processes to exit.
Jan 22 21:25:23 np0005592765.novalocal systemd-logind[801]: Removed session 5.
Jan 22 21:25:42 np0005592765.novalocal sshd-session[27044]: Unable to negotiate with 38.102.83.144 port 54974: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 21:25:42 np0005592765.novalocal sshd-session[27047]: Connection closed by 38.102.83.144 port 54942 [preauth]
Jan 22 21:25:42 np0005592765.novalocal sshd-session[27048]: Connection closed by 38.102.83.144 port 54948 [preauth]
Jan 22 21:25:42 np0005592765.novalocal sshd-session[27045]: Unable to negotiate with 38.102.83.144 port 54950: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 22 21:25:42 np0005592765.novalocal sshd-session[27051]: Unable to negotiate with 38.102.83.144 port 54966: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 21:25:48 np0005592765.novalocal sshd-session[28710]: Accepted publickey for zuul from 38.102.83.114 port 45816 ssh2: RSA SHA256:5hPaKzzgux2WRINR8nUX2B9zeLH3Zuh4lEFbJIE1uhE
Jan 22 21:25:48 np0005592765.novalocal systemd-logind[801]: New session 6 of user zuul.
Jan 22 21:25:48 np0005592765.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 22 21:25:48 np0005592765.novalocal sshd-session[28710]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:25:48 np0005592765.novalocal python3[28809]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:25:48 np0005592765.novalocal sudo[28940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhkwnhiejuxhhjxfztplrothxohttpx ; /usr/bin/python3'
Jan 22 21:25:48 np0005592765.novalocal sudo[28940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:49 np0005592765.novalocal python3[28948]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:25:49 np0005592765.novalocal sudo[28940]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:49 np0005592765.novalocal sudo[29280]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvandheeehqyuyehxrtzyeftsotsbfos ; /usr/bin/python3'
Jan 22 21:25:49 np0005592765.novalocal sudo[29280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:50 np0005592765.novalocal python3[29291]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005592765.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 22 21:25:50 np0005592765.novalocal useradd[29355]: new group: name=cloud-admin, GID=1002
Jan 22 21:25:50 np0005592765.novalocal useradd[29355]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 22 21:25:50 np0005592765.novalocal sudo[29280]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:50 np0005592765.novalocal sudo[29457]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxgaskxfjzzrmfrflqomczbefdeicgbi ; /usr/bin/python3'
Jan 22 21:25:50 np0005592765.novalocal sudo[29457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:50 np0005592765.novalocal python3[29467]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 21:25:50 np0005592765.novalocal sudo[29457]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:51 np0005592765.novalocal sudo[29684]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugjcfbxjizsromqgjhemyaxtritmoazw ; /usr/bin/python3'
Jan 22 21:25:51 np0005592765.novalocal sudo[29684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:51 np0005592765.novalocal python3[29693]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:25:51 np0005592765.novalocal sudo[29684]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:51 np0005592765.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:25:51 np0005592765.novalocal systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:25:51 np0005592765.novalocal systemd[1]: man-db-cache-update.service: Consumed 1min 9.415s CPU time.
Jan 22 21:25:51 np0005592765.novalocal systemd[1]: run-re9622d4d16ec4bd281177858a5069cf5.service: Deactivated successfully.
Jan 22 21:25:51 np0005592765.novalocal sudo[29859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vonjlmpnrrvgxibaadmegwvaqgzaynaa ; /usr/bin/python3'
Jan 22 21:25:51 np0005592765.novalocal sudo[29859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:51 np0005592765.novalocal python3[29861]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769117150.8416355-167-223002244017489/source _original_basename=tmps9nbxpto follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:25:51 np0005592765.novalocal sudo[29859]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:52 np0005592765.novalocal sudo[29909]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txgupxmlmrgeudjfjheeddazvwzdfspd ; /usr/bin/python3'
Jan 22 21:25:52 np0005592765.novalocal sudo[29909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:25:52 np0005592765.novalocal python3[29911]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 22 21:25:52 np0005592765.novalocal systemd[1]: Starting Hostname Service...
Jan 22 21:25:52 np0005592765.novalocal systemd[1]: Started Hostname Service.
Jan 22 21:25:52 np0005592765.novalocal systemd-hostnamed[29915]: Changed pretty hostname to 'compute-0'
Jan 22 21:25:52 compute-0 systemd-hostnamed[29915]: Hostname set to <compute-0> (static)
Jan 22 21:25:52 compute-0 NetworkManager[7195]: <info>  [1769117152.7457] hostname: static hostname changed from "np0005592765.novalocal" to "compute-0"
Jan 22 21:25:52 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:25:52 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:25:52 compute-0 sudo[29909]: pam_unix(sudo:session): session closed for user root
Jan 22 21:25:53 compute-0 sshd-session[28754]: Connection closed by 38.102.83.114 port 45816
Jan 22 21:25:53 compute-0 sshd-session[28710]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:25:53 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 22 21:25:53 compute-0 systemd[1]: session-6.scope: Consumed 2.695s CPU time.
Jan 22 21:25:53 compute-0 systemd-logind[801]: Session 6 logged out. Waiting for processes to exit.
Jan 22 21:25:53 compute-0 systemd-logind[801]: Removed session 6.
Jan 22 21:26:02 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:26:22 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 21:27:45 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 22 21:27:45 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 22 21:27:45 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 22 21:27:45 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 22 21:30:26 compute-0 sshd-session[29937]: Accepted publickey for zuul from 38.102.83.144 port 45294 ssh2: RSA SHA256:5hPaKzzgux2WRINR8nUX2B9zeLH3Zuh4lEFbJIE1uhE
Jan 22 21:30:26 compute-0 systemd-logind[801]: New session 7 of user zuul.
Jan 22 21:30:26 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 22 21:30:26 compute-0 sshd-session[29937]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:30:26 compute-0 python3[30013]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:30:28 compute-0 sudo[30127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbvdcomuwwgaizjqncwluzgguuksmqee ; /usr/bin/python3'
Jan 22 21:30:28 compute-0 sudo[30127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:28 compute-0 python3[30129]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:28 compute-0 sudo[30127]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:28 compute-0 sudo[30200]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buqwaedcmhzouuhnhbmeswdepykhvsfz ; /usr/bin/python3'
Jan 22 21:30:28 compute-0 sudo[30200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:28 compute-0 python3[30202]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:28 compute-0 sudo[30200]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:29 compute-0 sudo[30226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkvghukafkirdlbibtahepdsvriythgu ; /usr/bin/python3'
Jan 22 21:30:29 compute-0 sudo[30226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:29 compute-0 python3[30228]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:29 compute-0 sudo[30226]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:29 compute-0 sudo[30299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crdezlwvsrtphlsarnpxyuwobpmmijsq ; /usr/bin/python3'
Jan 22 21:30:29 compute-0 sudo[30299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:29 compute-0 python3[30301]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:29 compute-0 sudo[30299]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:29 compute-0 sudo[30325]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnyyaiprtihuropsgukayxspiqxqxhxr ; /usr/bin/python3'
Jan 22 21:30:29 compute-0 sudo[30325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:29 compute-0 python3[30327]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:29 compute-0 sudo[30325]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:30 compute-0 sudo[30398]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qikuowywypfouitfabdenlrgfowqhafn ; /usr/bin/python3'
Jan 22 21:30:30 compute-0 sudo[30398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:30 compute-0 python3[30400]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:30 compute-0 sudo[30398]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:30 compute-0 sudo[30424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejwahqbqmlbbjojlaatgscnycnmlvhba ; /usr/bin/python3'
Jan 22 21:30:30 compute-0 sudo[30424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:30 compute-0 python3[30426]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:30 compute-0 sudo[30424]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:30 compute-0 sudo[30497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csrdrslnzgzdoozgzhrrjzamxoenpsrj ; /usr/bin/python3'
Jan 22 21:30:30 compute-0 sudo[30497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:31 compute-0 python3[30499]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:31 compute-0 sudo[30497]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:31 compute-0 sudo[30523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lastpvhpsmmcueecozjmcsfbivvhoxbs ; /usr/bin/python3'
Jan 22 21:30:31 compute-0 sudo[30523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:31 compute-0 python3[30525]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:31 compute-0 sudo[30523]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:31 compute-0 sudo[30596]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeylihgkuzxqtoapyszrivmxbmhfrkck ; /usr/bin/python3'
Jan 22 21:30:31 compute-0 sudo[30596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:31 compute-0 python3[30598]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:31 compute-0 sudo[30596]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:31 compute-0 sudo[30622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgztinaidvqhwqyaohmdhurpdmotcwag ; /usr/bin/python3'
Jan 22 21:30:31 compute-0 sudo[30622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:31 compute-0 python3[30624]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:31 compute-0 sudo[30622]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:32 compute-0 sudo[30695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzlakzxwpwebggwhizxyjjibaxgoinhh ; /usr/bin/python3'
Jan 22 21:30:32 compute-0 sudo[30695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:32 compute-0 python3[30697]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:32 compute-0 sudo[30695]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:32 compute-0 sudo[30721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsixjzypkhtjzqzijrxkanpwnygrkybs ; /usr/bin/python3'
Jan 22 21:30:32 compute-0 sudo[30721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:32 compute-0 python3[30723]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 21:30:32 compute-0 sudo[30721]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:32 compute-0 sudo[30794]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rklxwyiuikuoesvihgeorkpaebatozvd ; /usr/bin/python3'
Jan 22 21:30:32 compute-0 sudo[30794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:30:33 compute-0 python3[30796]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.114687-34004-79480675437793/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:30:33 compute-0 sudo[30794]: pam_unix(sudo:session): session closed for user root
Jan 22 21:30:35 compute-0 sshd-session[30821]: Unable to negotiate with 192.168.122.11 port 33118: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 22 21:30:35 compute-0 sshd-session[30822]: Connection closed by 192.168.122.11 port 33104 [preauth]
Jan 22 21:30:35 compute-0 sshd-session[30823]: Connection closed by 192.168.122.11 port 33090 [preauth]
Jan 22 21:30:35 compute-0 sshd-session[30826]: Unable to negotiate with 192.168.122.11 port 33124: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 21:30:35 compute-0 sshd-session[30825]: Unable to negotiate with 192.168.122.11 port 33136: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 21:30:41 compute-0 python3[30854]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:35:41 compute-0 sshd-session[29940]: Received disconnect from 38.102.83.144 port 45294:11: disconnected by user
Jan 22 21:35:41 compute-0 sshd-session[29940]: Disconnected from user zuul 38.102.83.144 port 45294
Jan 22 21:35:41 compute-0 sshd-session[29937]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:35:41 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 22 21:35:41 compute-0 systemd[1]: session-7.scope: Consumed 5.799s CPU time.
Jan 22 21:35:41 compute-0 systemd-logind[801]: Session 7 logged out. Waiting for processes to exit.
Jan 22 21:35:41 compute-0 systemd-logind[801]: Removed session 7.
Jan 22 21:38:24 compute-0 sshd-session[30861]: Connection closed by 203.156.216.10 port 44131
Jan 22 21:40:26 compute-0 sshd[1009]: Timeout before authentication for connection from 203.156.216.10 to 38.102.83.50, pid = 30862
Jan 22 21:41:37 compute-0 sshd-session[30865]: Connection closed by 45.148.10.121 port 60664 [preauth]
Jan 22 21:46:06 compute-0 sshd-session[30869]: Accepted publickey for zuul from 192.168.122.30 port 57752 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:46:06 compute-0 systemd-logind[801]: New session 8 of user zuul.
Jan 22 21:46:06 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 22 21:46:06 compute-0 sshd-session[30869]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:46:07 compute-0 python3.9[31022]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:46:08 compute-0 sudo[31201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltnwaspgcijmokcmuvpnbswxzjhxjluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118367.8690953-56-193846089307753/AnsiballZ_command.py'
Jan 22 21:46:08 compute-0 sudo[31201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:08 compute-0 python3.9[31203]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:46:18 compute-0 sudo[31201]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:25 compute-0 sshd-session[30872]: Connection closed by 192.168.122.30 port 57752
Jan 22 21:46:25 compute-0 sshd-session[30869]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:46:25 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 22 21:46:25 compute-0 systemd[1]: session-8.scope: Consumed 8.565s CPU time.
Jan 22 21:46:25 compute-0 systemd-logind[801]: Session 8 logged out. Waiting for processes to exit.
Jan 22 21:46:25 compute-0 systemd-logind[801]: Removed session 8.
Jan 22 21:46:41 compute-0 sshd-session[31261]: Accepted publickey for zuul from 192.168.122.30 port 46878 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:46:41 compute-0 systemd-logind[801]: New session 9 of user zuul.
Jan 22 21:46:41 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 22 21:46:41 compute-0 sshd-session[31261]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:46:42 compute-0 python3.9[31414]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 22 21:46:43 compute-0 python3.9[31588]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:46:44 compute-0 sudo[31738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rriklzckyyvufnhixlmhakbrbpbwtdjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118403.5701554-93-202638311582765/AnsiballZ_command.py'
Jan 22 21:46:44 compute-0 sudo[31738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:44 compute-0 python3.9[31740]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:46:44 compute-0 sudo[31738]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:45 compute-0 sudo[31891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrztfgrpfqzvjullyaneuiznovytvljk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118404.6806767-129-24252506001384/AnsiballZ_stat.py'
Jan 22 21:46:45 compute-0 sudo[31891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:45 compute-0 python3.9[31893]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:46:45 compute-0 sudo[31891]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:46 compute-0 sudo[32043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cijzquobsduqaopesxuwxpqjdafvryxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118405.5709019-153-125610891797618/AnsiballZ_file.py'
Jan 22 21:46:46 compute-0 sudo[32043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:46 compute-0 python3.9[32045]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:46:46 compute-0 sudo[32043]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:47 compute-0 sudo[32195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpmnoqmwuidzoqejbvdzufhjrfszuheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118406.6768017-177-58210967426099/AnsiballZ_stat.py'
Jan 22 21:46:47 compute-0 sudo[32195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:47 compute-0 python3.9[32197]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:46:47 compute-0 sudo[32195]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:47 compute-0 sudo[32318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaimfiwjchblatlwjahtoiesvtdgolfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118406.6768017-177-58210967426099/AnsiballZ_copy.py'
Jan 22 21:46:47 compute-0 sudo[32318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:47 compute-0 python3.9[32320]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118406.6768017-177-58210967426099/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:46:47 compute-0 sudo[32318]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:48 compute-0 sudo[32470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnjdlxpjvnfoktqrmanpdywafrzxrxzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118408.160696-222-221440977507543/AnsiballZ_setup.py'
Jan 22 21:46:48 compute-0 sudo[32470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:48 compute-0 python3.9[32472]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:46:49 compute-0 sudo[32470]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:49 compute-0 sudo[32626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqixrncwpodxvyvkqzrveikwujemqtgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118409.2738805-246-81877863908305/AnsiballZ_file.py'
Jan 22 21:46:49 compute-0 sudo[32626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:49 compute-0 python3.9[32628]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:46:49 compute-0 sudo[32626]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:50 compute-0 sudo[32778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tloodwxktoaonqidtlvabzaqoqbukguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118410.0697725-273-103219568248598/AnsiballZ_file.py'
Jan 22 21:46:50 compute-0 sudo[32778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:46:50 compute-0 python3.9[32780]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:46:50 compute-0 sudo[32778]: pam_unix(sudo:session): session closed for user root
Jan 22 21:46:51 compute-0 python3.9[32930]: ansible-ansible.builtin.service_facts Invoked
Jan 22 21:46:57 compute-0 python3.9[33183]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:46:58 compute-0 python3.9[33333]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:46:59 compute-0 python3.9[33487]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:47:00 compute-0 sudo[33643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvbfimlsvxrelowsqvxixkwnvhtnfprh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118419.8397255-417-70266024107929/AnsiballZ_setup.py'
Jan 22 21:47:00 compute-0 sudo[33643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:47:00 compute-0 python3.9[33645]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:47:00 compute-0 sudo[33643]: pam_unix(sudo:session): session closed for user root
Jan 22 21:47:01 compute-0 sudo[33727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reulwgrynwvncsureymoxxsspbfibevu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118419.8397255-417-70266024107929/AnsiballZ_dnf.py'
Jan 22 21:47:01 compute-0 sudo[33727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:47:01 compute-0 python3.9[33729]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:47:59 compute-0 systemd[1]: Reloading.
Jan 22 21:47:59 compute-0 systemd-rc-local-generator[33920]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:47:59 compute-0 systemd[1]: Starting dnf makecache...
Jan 22 21:47:59 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 22 21:47:59 compute-0 dnf[33936]: Failed determining last makecache time.
Jan 22 21:47:59 compute-0 dnf[33936]: delorean-openstack-barbican-42b4c41831408a8e323 153 kB/s | 3.0 kB     00:00
Jan 22 21:47:59 compute-0 systemd[1]: Reloading.
Jan 22 21:48:00 compute-0 systemd-rc-local-generator[33972]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:48:00 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 22 21:48:00 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 22 21:48:00 compute-0 systemd[1]: Reloading.
Jan 22 21:48:00 compute-0 systemd-rc-local-generator[34011]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 6.3 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 22 21:48:00 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 21:48:00 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 21:48:00 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-cinder-1c00d6490d88e436f26ef 9.9 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-stevedore-c4acc5639fd2329372142 132 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-cloudkitty-tests-tempest-2c80f8 141 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-os-refresh-config-9bfc52b5049be2d8de61 132 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 147 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-designate-tests-tempest-347fdbc 155 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-glance-1fd12c29b339f30fe823e 160 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 162 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-manila-3c01b7181572c95dac462 165 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-whitebox-neutron-tests-tempest- 174 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-octavia-ba397f07a7331190208c 188 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-watcher-c014f81a8647287f6dcc 199 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-ansible-config_template-5ccaa22121a7ff 196 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 205 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-openstack-swift-dc98a8463506ac520c469a 193 kB/s | 3.0 kB     00:00
Jan 22 21:48:00 compute-0 dnf[33936]: delorean-python-tempestconf-8515371b7cceebd4282 205 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: delorean-openstack-heat-ui-013accbfd179753bc3f0 186 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: CentOS Stream 9 - BaseOS                         61 kB/s | 6.7 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: CentOS Stream 9 - AppStream                      63 kB/s | 6.8 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: CentOS Stream 9 - Extras packages                31 kB/s | 7.3 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: dlrn-antelope-testing                           134 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: dlrn-antelope-build-deps                        140 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: centos9-rabbitmq                                 97 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: centos9-storage                                  96 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: centos9-opstools                                100 kB/s | 3.0 kB     00:00
Jan 22 21:48:01 compute-0 dnf[33936]: NFV SIG OpenvSwitch                              88 kB/s | 3.0 kB     00:00
Jan 22 21:48:02 compute-0 dnf[33936]: repo-setup-centos-appstream                     167 kB/s | 4.4 kB     00:00
Jan 22 21:48:02 compute-0 dnf[33936]: repo-setup-centos-baseos                        167 kB/s | 3.9 kB     00:00
Jan 22 21:48:02 compute-0 dnf[33936]: repo-setup-centos-highavailability              107 kB/s | 3.9 kB     00:00
Jan 22 21:48:02 compute-0 dnf[33936]: repo-setup-centos-powertools                    160 kB/s | 4.3 kB     00:00
Jan 22 21:48:02 compute-0 dnf[33936]: Extra Packages for Enterprise Linux 9 - x86_64  180 kB/s |  28 kB     00:00
Jan 22 21:48:03 compute-0 dnf[33936]: Metadata cache created.
Jan 22 21:48:03 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 21:48:03 compute-0 systemd[1]: Finished dnf makecache.
Jan 22 21:48:03 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.971s CPU time.
Jan 22 21:49:17 compute-0 kernel: SELinux:  Converting 2724 SID table entries...
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:49:18 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:49:18 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 22 21:49:18 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:49:18 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:49:18 compute-0 systemd[1]: Reloading.
Jan 22 21:49:18 compute-0 systemd-rc-local-generator[34362]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:49:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:49:19 compute-0 sudo[33727]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:49:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:49:20 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.510s CPU time.
Jan 22 21:49:20 compute-0 systemd[1]: run-r686942ebe06846158982e0531d8b2572.service: Deactivated successfully.
Jan 22 21:49:49 compute-0 sudo[35282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjjdrkoefekizloembthnijtoisgpwzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118589.1843755-453-65267700721732/AnsiballZ_command.py'
Jan 22 21:49:49 compute-0 sudo[35282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:49 compute-0 python3.9[35284]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:49:50 compute-0 sudo[35282]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:51 compute-0 sudo[35563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cohyfeqbtqgjcobnggfqnxkqvpdjklpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118590.7849424-477-193093060283626/AnsiballZ_selinux.py'
Jan 22 21:49:51 compute-0 sudo[35563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:51 compute-0 python3.9[35565]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 22 21:49:51 compute-0 sudo[35563]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:52 compute-0 sudo[35715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzseamcvfzoxpnrqbdfhoxmszfficjab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118592.1960819-510-92098396699522/AnsiballZ_command.py'
Jan 22 21:49:52 compute-0 sudo[35715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:52 compute-0 python3.9[35717]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 22 21:49:53 compute-0 sudo[35715]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:54 compute-0 sudo[35868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgvgfzxqqjmbstwlxhrkwzsorhxtgxob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118593.778348-534-208009617622594/AnsiballZ_file.py'
Jan 22 21:49:54 compute-0 sudo[35868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:55 compute-0 python3.9[35870]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:49:55 compute-0 sudo[35868]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:55 compute-0 sudo[36020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqydckpedrefcyeswnhcntclkoxgxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118595.2951226-558-169633457421172/AnsiballZ_mount.py'
Jan 22 21:49:55 compute-0 sudo[36020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:56 compute-0 python3.9[36022]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 22 21:49:56 compute-0 sudo[36020]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:58 compute-0 sudo[36172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzftapltnuvyzupudabvpjiofnhxsujx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118597.811514-642-53857418124597/AnsiballZ_file.py'
Jan 22 21:49:58 compute-0 sudo[36172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:49:58 compute-0 python3.9[36174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:49:58 compute-0 sudo[36172]: pam_unix(sudo:session): session closed for user root
Jan 22 21:49:58 compute-0 sudo[36324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfpvapuwoknmkozadnnzlvhzlhkxufxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118598.5200648-666-211993925837341/AnsiballZ_stat.py'
Jan 22 21:49:58 compute-0 sudo[36324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:00 compute-0 python3.9[36326]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:50:00 compute-0 sudo[36324]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:00 compute-0 sudo[36447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybdgfglvmazuphipballcrlxjwmupayl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118598.5200648-666-211993925837341/AnsiballZ_copy.py'
Jan 22 21:50:00 compute-0 sudo[36447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:01 compute-0 python3.9[36449]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118598.5200648-666-211993925837341/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:50:01 compute-0 sudo[36447]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:07 compute-0 sudo[36600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxrnzlczjbuiflpzxphtphuroffuplip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118607.3774052-738-58610062317809/AnsiballZ_stat.py'
Jan 22 21:50:07 compute-0 sudo[36600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:07 compute-0 python3.9[36602]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:50:07 compute-0 sudo[36600]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:08 compute-0 sudo[36752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfbyxmcbvbtmeehnavweomvqmqhcmeci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118608.0768085-762-69667615904665/AnsiballZ_command.py'
Jan 22 21:50:08 compute-0 sudo[36752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:08 compute-0 python3.9[36754]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:08 compute-0 sudo[36752]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:09 compute-0 sudo[36905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchtoxnbcvzdsaywxipcixunfvwokkwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118609.3131537-786-203897590790597/AnsiballZ_file.py'
Jan 22 21:50:09 compute-0 sudo[36905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:09 compute-0 python3.9[36907]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:50:09 compute-0 sudo[36905]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:10 compute-0 sudo[37057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcejeauvlunslshqtacpzilixfnrcosf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118610.2795775-819-8997081842713/AnsiballZ_getent.py'
Jan 22 21:50:10 compute-0 sudo[37057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:10 compute-0 python3.9[37059]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 22 21:50:10 compute-0 sudo[37057]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:10 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 21:50:11 compute-0 sudo[37211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwiasqvwckhalsvgitjxvrrycwxnisd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118611.1015413-843-43437466267065/AnsiballZ_group.py'
Jan 22 21:50:11 compute-0 sudo[37211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:11 compute-0 python3.9[37213]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 21:50:12 compute-0 groupadd[37214]: group added to /etc/group: name=qemu, GID=107
Jan 22 21:50:12 compute-0 groupadd[37214]: group added to /etc/gshadow: name=qemu
Jan 22 21:50:12 compute-0 groupadd[37214]: new group: name=qemu, GID=107
Jan 22 21:50:12 compute-0 sudo[37211]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:12 compute-0 sudo[37369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gizqdnfeefvhdzdvgkypphsyhkoxyrlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118612.3648024-867-124329177641886/AnsiballZ_user.py'
Jan 22 21:50:12 compute-0 sudo[37369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:13 compute-0 python3.9[37371]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 21:50:13 compute-0 useradd[37373]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 21:50:13 compute-0 sudo[37369]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:14 compute-0 sudo[37529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noucpvgayljzbxtvaqklbommjfqravfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118613.8588536-891-187024769075792/AnsiballZ_getent.py'
Jan 22 21:50:14 compute-0 sudo[37529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:14 compute-0 python3.9[37531]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 22 21:50:14 compute-0 sudo[37529]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:14 compute-0 sudo[37682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzdcancehwmgnybvvkcwkqftgnqzlhct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118614.611277-915-59758525080410/AnsiballZ_group.py'
Jan 22 21:50:14 compute-0 sudo[37682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:15 compute-0 python3.9[37684]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 21:50:15 compute-0 groupadd[37685]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 22 21:50:15 compute-0 groupadd[37685]: group added to /etc/gshadow: name=hugetlbfs
Jan 22 21:50:15 compute-0 groupadd[37685]: new group: name=hugetlbfs, GID=42477
Jan 22 21:50:15 compute-0 sudo[37682]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:15 compute-0 sudo[37840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojjkexobcduxxuzdykduyebjcpvseipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118615.6200402-942-111097411706497/AnsiballZ_file.py'
Jan 22 21:50:15 compute-0 sudo[37840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:16 compute-0 python3.9[37842]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 22 21:50:16 compute-0 sudo[37840]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:16 compute-0 sudo[37992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjwfojcpuiyqldzyuhwcfaugbybxajh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118616.5274448-975-81284052772584/AnsiballZ_dnf.py'
Jan 22 21:50:16 compute-0 sudo[37992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:17 compute-0 python3.9[37994]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:50:20 compute-0 sudo[37992]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:20 compute-0 sudo[38145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuliyjmzsnukwkxbvybrevyoyyoemztx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118620.2826955-999-99007777542633/AnsiballZ_file.py'
Jan 22 21:50:20 compute-0 sudo[38145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:20 compute-0 python3.9[38147]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:50:20 compute-0 sudo[38145]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:21 compute-0 sudo[38297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czagrcsebkbcoiwxagmumeedanmfevdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118621.0068202-1023-210596324753394/AnsiballZ_stat.py'
Jan 22 21:50:21 compute-0 sudo[38297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:21 compute-0 python3.9[38299]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:50:21 compute-0 sudo[38297]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:21 compute-0 sudo[38420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpfqzpdujlgvkhwyipimjsvnivzcvxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118621.0068202-1023-210596324753394/AnsiballZ_copy.py'
Jan 22 21:50:21 compute-0 sudo[38420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:22 compute-0 python3.9[38422]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118621.0068202-1023-210596324753394/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:50:22 compute-0 sudo[38420]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:22 compute-0 sudo[38572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzhxonmqzmsqifexnfoczogoixtwtsjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118622.3979998-1068-28931102664575/AnsiballZ_systemd.py'
Jan 22 21:50:22 compute-0 sudo[38572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:23 compute-0 python3.9[38574]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:50:23 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 21:50:23 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 22 21:50:23 compute-0 kernel: Bridge firewalling registered
Jan 22 21:50:23 compute-0 systemd-modules-load[38578]: Inserted module 'br_netfilter'
Jan 22 21:50:23 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 21:50:23 compute-0 sudo[38572]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:23 compute-0 sudo[38734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omzgsnqkijytybtjfrcuzhfkhhkctzji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118623.6570153-1092-102976508348170/AnsiballZ_stat.py'
Jan 22 21:50:23 compute-0 sudo[38734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:24 compute-0 python3.9[38736]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:50:24 compute-0 sudo[38734]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:24 compute-0 sudo[38857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmmbruhzwihoqxlolxmoufwaxlxzwqvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118623.6570153-1092-102976508348170/AnsiballZ_copy.py'
Jan 22 21:50:24 compute-0 sudo[38857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:24 compute-0 python3.9[38859]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118623.6570153-1092-102976508348170/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:50:24 compute-0 sudo[38857]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:25 compute-0 sudo[39009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srhlhqlqggkgnipjjdpbmpqfnojydeps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118625.3437417-1146-263792040534303/AnsiballZ_dnf.py'
Jan 22 21:50:25 compute-0 sudo[39009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:25 compute-0 python3.9[39011]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:50:29 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 21:50:29 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 21:50:29 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:50:29 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:50:29 compute-0 systemd[1]: Reloading.
Jan 22 21:50:29 compute-0 systemd-rc-local-generator[39075]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:50:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:50:30 compute-0 sudo[39009]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:34 compute-0 python3.9[42723]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:50:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:50:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:50:34 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.272s CPU time.
Jan 22 21:50:34 compute-0 systemd[1]: run-r94b04f7b0b1946baaf22bc8c5a8193e2.service: Deactivated successfully.
Jan 22 21:50:35 compute-0 python3.9[42876]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 22 21:50:36 compute-0 python3.9[43026]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:50:37 compute-0 sudo[43176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntcmskxxzybbmjzksrlbijauwcjtygzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118636.9064417-1263-31808948237535/AnsiballZ_command.py'
Jan 22 21:50:37 compute-0 sudo[43176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:37 compute-0 python3.9[43178]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:37 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 21:50:38 compute-0 systemd[1]: Starting Authorization Manager...
Jan 22 21:50:38 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 21:50:38 compute-0 polkitd[43395]: Started polkitd version 0.117
Jan 22 21:50:38 compute-0 polkitd[43395]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 21:50:38 compute-0 polkitd[43395]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 21:50:38 compute-0 polkitd[43395]: Finished loading, compiling and executing 2 rules
Jan 22 21:50:38 compute-0 systemd[1]: Started Authorization Manager.
Jan 22 21:50:38 compute-0 polkitd[43395]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 22 21:50:38 compute-0 sudo[43176]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:38 compute-0 sudo[43563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thvfqpnwlslwqhlwokpamaacdhhtypjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118638.4251246-1290-154863353171289/AnsiballZ_systemd.py'
Jan 22 21:50:38 compute-0 sudo[43563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:39 compute-0 python3.9[43565]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:50:39 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 22 21:50:39 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 22 21:50:39 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 22 21:50:39 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 21:50:39 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 21:50:39 compute-0 sudo[43563]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:40 compute-0 python3.9[43727]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 22 21:50:43 compute-0 sudo[43877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gameqlcwhadqklqulporwazwgwjmeije ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118643.2316868-1461-125187639525427/AnsiballZ_systemd.py'
Jan 22 21:50:43 compute-0 sudo[43877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:43 compute-0 python3.9[43879]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:50:43 compute-0 systemd[1]: Reloading.
Jan 22 21:50:44 compute-0 systemd-rc-local-generator[43908]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:50:44 compute-0 sudo[43877]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:44 compute-0 sudo[44065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbzkunfdocyiueqwmpmdzimpffqyhmqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118644.3784745-1461-132863845540610/AnsiballZ_systemd.py'
Jan 22 21:50:44 compute-0 sudo[44065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:45 compute-0 python3.9[44067]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:50:45 compute-0 systemd[1]: Reloading.
Jan 22 21:50:45 compute-0 systemd-rc-local-generator[44094]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:50:45 compute-0 sudo[44065]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:46 compute-0 sudo[44254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esrncmfaxfnuepntnufqsbxfgrdamwle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118645.6777265-1509-168992729304080/AnsiballZ_command.py'
Jan 22 21:50:46 compute-0 sudo[44254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:46 compute-0 python3.9[44256]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:46 compute-0 sudo[44254]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:46 compute-0 sudo[44407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbdnbzguubhqazfdemngicrcvggenuwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118646.432103-1533-85069939965779/AnsiballZ_command.py'
Jan 22 21:50:46 compute-0 sudo[44407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:46 compute-0 python3.9[44409]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:46 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 22 21:50:46 compute-0 sudo[44407]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:47 compute-0 sudo[44560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xknqlemvkeunwlpxoricbewkcwxsfraz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118647.1660604-1557-21469551706625/AnsiballZ_command.py'
Jan 22 21:50:47 compute-0 sudo[44560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:47 compute-0 python3.9[44562]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:49 compute-0 sudo[44560]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:49 compute-0 sudo[44722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arwnmdqpqroobtsiatjoakffkhtwilka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118649.4588096-1581-32439593067203/AnsiballZ_command.py'
Jan 22 21:50:49 compute-0 sudo[44722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:49 compute-0 python3.9[44724]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:50:50 compute-0 sudo[44722]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:50 compute-0 sudo[44875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbudqatojqfckfofsodmdwkasdieizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118650.185604-1605-98997100330477/AnsiballZ_systemd.py'
Jan 22 21:50:50 compute-0 sudo[44875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:50:50 compute-0 python3.9[44877]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:50:50 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 21:50:50 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 21:50:50 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 22 21:50:50 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 22 21:50:50 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 21:50:50 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 22 21:50:51 compute-0 sudo[44875]: pam_unix(sudo:session): session closed for user root
Jan 22 21:50:52 compute-0 sshd-session[31264]: Connection closed by 192.168.122.30 port 46878
Jan 22 21:50:52 compute-0 sshd-session[31261]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:50:52 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 22 21:50:52 compute-0 systemd[1]: session-9.scope: Consumed 2min 21.467s CPU time.
Jan 22 21:50:52 compute-0 systemd-logind[801]: Session 9 logged out. Waiting for processes to exit.
Jan 22 21:50:52 compute-0 systemd-logind[801]: Removed session 9.
Jan 22 21:50:58 compute-0 sshd-session[44907]: Accepted publickey for zuul from 192.168.122.30 port 36878 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:50:58 compute-0 systemd-logind[801]: New session 10 of user zuul.
Jan 22 21:50:58 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 22 21:50:58 compute-0 sshd-session[44907]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:50:59 compute-0 python3.9[45060]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:51:00 compute-0 python3.9[45214]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:51:01 compute-0 sudo[45368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moumuvqfvbdikrmkmrmykpndmuuswpye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118661.2927988-110-20905687228847/AnsiballZ_command.py'
Jan 22 21:51:01 compute-0 sudo[45368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:01 compute-0 python3.9[45370]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:51:01 compute-0 sudo[45368]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:03 compute-0 python3.9[45521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:51:03 compute-0 sudo[45676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsjtqygmrfkcvoqsrchujimgmfuttsna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118663.4562654-170-239449700015572/AnsiballZ_setup.py'
Jan 22 21:51:03 compute-0 sudo[45676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:04 compute-0 python3.9[45678]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:51:04 compute-0 sudo[45676]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:04 compute-0 sudo[45760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwyfknelrmscvajhzbnvaqqinpbfhikb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118663.4562654-170-239449700015572/AnsiballZ_dnf.py'
Jan 22 21:51:04 compute-0 sudo[45760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:05 compute-0 python3.9[45762]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:51:06 compute-0 sudo[45760]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:06 compute-0 sudo[45913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrpbtlmzqquvawbomqlsudbvbhwgismp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118666.4339826-206-191835914578996/AnsiballZ_setup.py'
Jan 22 21:51:06 compute-0 sudo[45913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:07 compute-0 python3.9[45915]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:51:07 compute-0 sudo[45913]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:08 compute-0 sudo[46084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-minoygtqwrdtlwwcgkdjbdgqjybycldn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118667.6550944-239-83005009572521/AnsiballZ_file.py'
Jan 22 21:51:08 compute-0 sudo[46084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:08 compute-0 python3.9[46086]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:51:08 compute-0 sudo[46084]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:10 compute-0 sudo[46236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juywhpqiburjixjsopluthtoaabvzomf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118670.4187062-263-130951903380809/AnsiballZ_command.py'
Jan 22 21:51:10 compute-0 sudo[46236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:10 compute-0 python3.9[46238]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:51:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3481352579-merged.mount: Deactivated successfully.
Jan 22 21:51:10 compute-0 podman[46239]: 2026-01-22 21:51:10.943016739 +0000 UTC m=+0.048014717 system refresh
Jan 22 21:51:10 compute-0 sudo[46236]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:11 compute-0 sudo[46398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsjwghqaxqhqzcwmbufdkcccnoicvbsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118671.1862772-287-134712136973355/AnsiballZ_stat.py'
Jan 22 21:51:11 compute-0 sudo[46398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:11 compute-0 python3.9[46400]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:51:11 compute-0 sudo[46398]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:51:14 compute-0 sudo[46521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfqezoylfpavejwdilwnfrewtoovfzul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118671.1862772-287-134712136973355/AnsiballZ_copy.py'
Jan 22 21:51:14 compute-0 sudo[46521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:14 compute-0 python3.9[46523]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118671.1862772-287-134712136973355/.source.json follow=False _original_basename=podman_network_config.j2 checksum=229183e8a4c8328d915efef9549b347ea9783741 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:51:14 compute-0 sudo[46521]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:15 compute-0 sudo[46673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfszwsdafsonlicdcuxmrhqmnznsmcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118675.2031598-332-139839433267608/AnsiballZ_stat.py'
Jan 22 21:51:15 compute-0 sudo[46673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:15 compute-0 python3.9[46675]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:51:15 compute-0 sudo[46673]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:16 compute-0 sudo[46796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wecxqqtlbtqbhfodrbhjgwbcfofyrznc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118675.2031598-332-139839433267608/AnsiballZ_copy.py'
Jan 22 21:51:16 compute-0 sudo[46796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:16 compute-0 python3.9[46798]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118675.2031598-332-139839433267608/.source.conf follow=False _original_basename=registries.conf.j2 checksum=3d06d4b51e1ab18af024121ef8fec31b3fc3dc21 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:51:16 compute-0 sudo[46796]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:17 compute-0 sudo[46948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxhlqsfrrizvwmnjeodpnxlulhnwbxhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118676.590392-380-251486164344277/AnsiballZ_ini_file.py'
Jan 22 21:51:17 compute-0 sudo[46948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:17 compute-0 python3.9[46950]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:51:17 compute-0 sudo[46948]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:17 compute-0 sudo[47100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfeyyxwtuzpqmwrzrzddntpwbmkjjhdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118677.418055-380-95800552308006/AnsiballZ_ini_file.py'
Jan 22 21:51:17 compute-0 sudo[47100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:17 compute-0 python3.9[47102]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:51:17 compute-0 sudo[47100]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:18 compute-0 sudo[47252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtbxbnefjxsdlknesaacshfbytmxkklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118678.0415375-380-230997939359028/AnsiballZ_ini_file.py'
Jan 22 21:51:18 compute-0 sudo[47252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:18 compute-0 python3.9[47254]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:51:18 compute-0 sudo[47252]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:18 compute-0 sudo[47404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wezpzujgzbscxsvvryxknemmcllxkgun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118678.6794267-380-130062197478235/AnsiballZ_ini_file.py'
Jan 22 21:51:18 compute-0 sudo[47404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:19 compute-0 python3.9[47406]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:51:19 compute-0 sudo[47404]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:20 compute-0 python3.9[47556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:51:20 compute-0 sudo[47708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riuockaavmfhtvxsrwopxcuxztiwklhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118680.4516466-500-156213280467747/AnsiballZ_dnf.py'
Jan 22 21:51:20 compute-0 sudo[47708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:20 compute-0 python3.9[47710]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:22 compute-0 sudo[47708]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:22 compute-0 sudo[47861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkzeelqjigbjnwlcxuonbgkrxsumxwrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118682.3924735-524-182417952626920/AnsiballZ_dnf.py'
Jan 22 21:51:22 compute-0 sudo[47861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:22 compute-0 python3.9[47863]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:24 compute-0 sudo[47861]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:26 compute-0 sudo[48021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfygjfsbvuuhihmbydfmvzqpfjesqdhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118686.103942-554-256916626184015/AnsiballZ_dnf.py'
Jan 22 21:51:26 compute-0 sudo[48021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:26 compute-0 python3.9[48023]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:27 compute-0 sudo[48021]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:28 compute-0 sudo[48174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ineiqvbhfxsxiwxkkpbnhwquupsfsjep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118688.1552656-581-263180757413908/AnsiballZ_dnf.py'
Jan 22 21:51:28 compute-0 sudo[48174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:28 compute-0 python3.9[48176]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:29 compute-0 sudo[48174]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:30 compute-0 sudo[48327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iibekkxwjzhhmdlguveulodyxphkvfiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118690.4250526-614-126734722315882/AnsiballZ_dnf.py'
Jan 22 21:51:30 compute-0 sudo[48327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:30 compute-0 python3.9[48329]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:32 compute-0 sudo[48327]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:33 compute-0 sudo[48483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmelkuyplwcxutzspfkexwksimflfwri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118693.0904005-638-109240382001626/AnsiballZ_dnf.py'
Jan 22 21:51:33 compute-0 sudo[48483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:33 compute-0 python3.9[48485]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:37 compute-0 sudo[48483]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:38 compute-0 sudo[48652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwnkurdsqaljdxzabkphpfblcsgmjtbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118698.0112717-665-34562908930503/AnsiballZ_dnf.py'
Jan 22 21:51:38 compute-0 sudo[48652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:38 compute-0 python3.9[48655]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:39 compute-0 sudo[48652]: pam_unix(sudo:session): session closed for user root
Jan 22 21:51:40 compute-0 sudo[48806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcngskhpnawkfvfqrpdewdabldwkkzgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118700.3178592-692-176826124691865/AnsiballZ_dnf.py'
Jan 22 21:51:40 compute-0 sudo[48806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:51:40 compute-0 python3.9[48808]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:51:59 compute-0 sudo[48806]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:36 compute-0 sudo[49142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blttmxtflbquircrdzbphfxjqjsclaea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118756.092097-719-214718493444629/AnsiballZ_dnf.py'
Jan 22 21:52:36 compute-0 sudo[49142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:36 compute-0 python3.9[49144]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:52:38 compute-0 sudo[49142]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:39 compute-0 sudo[49298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kshpgodqhgdmljlkpooqortzqredpuel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118758.7701974-749-25039306391255/AnsiballZ_dnf.py'
Jan 22 21:52:39 compute-0 sudo[49298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:39 compute-0 python3.9[49300]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:52:41 compute-0 sudo[49298]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:42 compute-0 sudo[49455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrekaomucfgukixmkndopiegsmqpwmhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118762.2988212-782-276299416589295/AnsiballZ_file.py'
Jan 22 21:52:42 compute-0 sudo[49455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:42 compute-0 python3.9[49457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:52:42 compute-0 sudo[49455]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:43 compute-0 sudo[49630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjoyybwzscybqagpqvjckdonbkerrbfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118763.1205733-806-180615824807362/AnsiballZ_stat.py'
Jan 22 21:52:43 compute-0 sudo[49630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:43 compute-0 python3.9[49632]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:52:43 compute-0 sudo[49630]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:44 compute-0 sudo[49753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwivyehynyvmhkwyjezbqsooxcperojp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118763.1205733-806-180615824807362/AnsiballZ_copy.py'
Jan 22 21:52:44 compute-0 sudo[49753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:44 compute-0 python3.9[49755]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769118763.1205733-806-180615824807362/.source.json _original_basename=.jfcknq_w follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:52:44 compute-0 sudo[49753]: pam_unix(sudo:session): session closed for user root
Jan 22 21:52:45 compute-0 sudo[49905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adrqphhesltompjpklibaiijwaueywdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118764.8137088-860-126007028617694/AnsiballZ_podman_image.py'
Jan 22 21:52:45 compute-0 sudo[49905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:52:45 compute-0 python3.9[49907]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 21:52:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:52:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat681397709-merged.mount: Deactivated successfully.
Jan 22 21:52:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat681397709-lower\x2dmapped.mount: Deactivated successfully.
Jan 22 21:52:56 compute-0 podman[49919]: 2026-01-22 21:52:56.310436205 +0000 UTC m=+10.609016114 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 21:52:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:52:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:52:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:52:56 compute-0 sudo[49905]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:00 compute-0 sudo[50215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svpfjllerunrnvbrrhjwhychsssdjfyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118779.6721835-899-130184100772172/AnsiballZ_podman_image.py'
Jan 22 21:53:00 compute-0 sudo[50215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:00 compute-0 python3.9[50217]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 21:53:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:21 compute-0 podman[50230]: 2026-01-22 21:53:21.74801057 +0000 UTC m=+21.382203855 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 21:53:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:21 compute-0 sudo[50215]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:36 compute-0 sudo[50526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmkelaqzyeamqemojtvsmjndnwlfaqhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118816.0880542-932-92624667965686/AnsiballZ_podman_image.py'
Jan 22 21:53:36 compute-0 sudo[50526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:36 compute-0 python3.9[50528]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 21:53:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:39 compute-0 podman[50540]: 2026-01-22 21:53:39.501277483 +0000 UTC m=+2.695981739 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 21:53:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:39 compute-0 sudo[50526]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:40 compute-0 sudo[50797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qppqauthkjldttvquqecmuupuswvumqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118819.9045792-932-54240747631246/AnsiballZ_podman_image.py'
Jan 22 21:53:40 compute-0 sudo[50797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:40 compute-0 python3.9[50799]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 21:53:41 compute-0 podman[50811]: 2026-01-22 21:53:41.697858181 +0000 UTC m=+1.201282104 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 21:53:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:53:41 compute-0 sudo[50797]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:45 compute-0 sshd-session[44910]: Connection closed by 192.168.122.30 port 36878
Jan 22 21:53:45 compute-0 sshd-session[44907]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:53:45 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 22 21:53:45 compute-0 systemd[1]: session-10.scope: Consumed 1min 41.630s CPU time.
Jan 22 21:53:45 compute-0 systemd-logind[801]: Session 10 logged out. Waiting for processes to exit.
Jan 22 21:53:45 compute-0 systemd-logind[801]: Removed session 10.
Jan 22 21:53:51 compute-0 sshd-session[50956]: Accepted publickey for zuul from 192.168.122.30 port 54590 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:53:51 compute-0 systemd-logind[801]: New session 11 of user zuul.
Jan 22 21:53:51 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 22 21:53:51 compute-0 sshd-session[50956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:53:52 compute-0 python3.9[51109]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:53:53 compute-0 sudo[51263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmulzpwiiqlvdchrthoiokklwcjycviq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118833.354791-68-249419355796179/AnsiballZ_getent.py'
Jan 22 21:53:53 compute-0 sudo[51263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:54 compute-0 python3.9[51265]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 22 21:53:54 compute-0 sudo[51263]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:54 compute-0 sudo[51416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpapkcitqletgkvfpzpfdirkfavdobzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118834.2172706-92-126955996330679/AnsiballZ_group.py'
Jan 22 21:53:54 compute-0 sudo[51416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:54 compute-0 python3.9[51418]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 21:53:54 compute-0 groupadd[51419]: group added to /etc/group: name=openvswitch, GID=42476
Jan 22 21:53:54 compute-0 groupadd[51419]: group added to /etc/gshadow: name=openvswitch
Jan 22 21:53:54 compute-0 groupadd[51419]: new group: name=openvswitch, GID=42476
Jan 22 21:53:54 compute-0 sudo[51416]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:55 compute-0 sudo[51574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwsxcywdqcmdoonkzdfxkcrzzbftjxjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118835.1684918-116-246975293513494/AnsiballZ_user.py'
Jan 22 21:53:55 compute-0 sudo[51574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:55 compute-0 python3.9[51576]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 21:53:55 compute-0 useradd[51578]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 21:53:55 compute-0 useradd[51578]: add 'openvswitch' to group 'hugetlbfs'
Jan 22 21:53:55 compute-0 useradd[51578]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 22 21:53:56 compute-0 sudo[51574]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:56 compute-0 sudo[51734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvuodgsweflhtfyktiwzvxsnyiqfrdgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118836.364757-146-281153722176672/AnsiballZ_setup.py'
Jan 22 21:53:56 compute-0 sudo[51734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:57 compute-0 python3.9[51736]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:53:57 compute-0 sudo[51734]: pam_unix(sudo:session): session closed for user root
Jan 22 21:53:57 compute-0 sudo[51818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrujdpmibkyjdwcnhnqataukawfbcqwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118836.364757-146-281153722176672/AnsiballZ_dnf.py'
Jan 22 21:53:57 compute-0 sudo[51818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:53:57 compute-0 python3.9[51820]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:53:59 compute-0 sudo[51818]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:00 compute-0 sudo[51980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woeumdzjhoowrudsjcecyliunnhqerfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118839.841681-188-224960516440874/AnsiballZ_dnf.py'
Jan 22 21:54:00 compute-0 sudo[51980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:00 compute-0 python3.9[51982]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:54:13 compute-0 kernel: SELinux:  Converting 2737 SID table entries...
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:54:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:54:13 compute-0 groupadd[52006]: group added to /etc/group: name=unbound, GID=994
Jan 22 21:54:13 compute-0 groupadd[52006]: group added to /etc/gshadow: name=unbound
Jan 22 21:54:13 compute-0 groupadd[52006]: new group: name=unbound, GID=994
Jan 22 21:54:13 compute-0 useradd[52013]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 22 21:54:13 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 22 21:54:13 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 22 21:54:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:54:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:54:15 compute-0 systemd[1]: Reloading.
Jan 22 21:54:15 compute-0 systemd-rc-local-generator[52503]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:54:15 compute-0 systemd-sysv-generator[52512]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:54:15 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:54:16 compute-0 sudo[51980]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:16 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:54:16 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:54:16 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.003s CPU time.
Jan 22 21:54:16 compute-0 systemd[1]: run-ra71b7a5e268041fda211ddaa64795a5f.service: Deactivated successfully.
Jan 22 21:54:23 compute-0 sudo[53080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvhmfckforfmvtlbdfruokpemhsstbuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118862.467349-212-204128556213294/AnsiballZ_systemd.py'
Jan 22 21:54:23 compute-0 sudo[53080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:23 compute-0 python3.9[53082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 21:54:23 compute-0 systemd[1]: Reloading.
Jan 22 21:54:23 compute-0 systemd-sysv-generator[53112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:54:23 compute-0 systemd-rc-local-generator[53109]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:54:23 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 22 21:54:23 compute-0 chown[53124]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 22 21:54:23 compute-0 ovs-ctl[53129]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 22 21:54:23 compute-0 ovs-ctl[53129]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 22 21:54:23 compute-0 ovs-ctl[53129]: Starting ovsdb-server [  OK  ]
Jan 22 21:54:23 compute-0 ovs-vsctl[53178]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 22 21:54:24 compute-0 ovs-vsctl[53198]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 22 21:54:24 compute-0 ovs-ctl[53129]: Configuring Open vSwitch system IDs [  OK  ]
Jan 22 21:54:24 compute-0 ovs-vsctl[53203]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 22 21:54:24 compute-0 ovs-ctl[53129]: Enabling remote OVSDB managers [  OK  ]
Jan 22 21:54:24 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 22 21:54:24 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 22 21:54:24 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 22 21:54:24 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 22 21:54:24 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 22 21:54:24 compute-0 ovs-ctl[53249]: Inserting openvswitch module [  OK  ]
Jan 22 21:54:24 compute-0 ovs-ctl[53218]: Starting ovs-vswitchd [  OK  ]
Jan 22 21:54:24 compute-0 ovs-vsctl[53266]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 22 21:54:24 compute-0 ovs-ctl[53218]: Enabling remote OVSDB managers [  OK  ]
Jan 22 21:54:24 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 22 21:54:24 compute-0 systemd[1]: Starting Open vSwitch...
Jan 22 21:54:24 compute-0 systemd[1]: Finished Open vSwitch.
Jan 22 21:54:24 compute-0 sudo[53080]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:25 compute-0 python3.9[53418]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:54:26 compute-0 sudo[53568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efdxukkyusidtlfajdeuxcdrixjctpar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118865.7893317-266-42394241742168/AnsiballZ_sefcontext.py'
Jan 22 21:54:26 compute-0 sudo[53568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:26 compute-0 python3.9[53570]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 22 21:54:28 compute-0 kernel: SELinux:  Converting 2751 SID table entries...
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 21:54:28 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 21:54:28 compute-0 sudo[53568]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:29 compute-0 python3.9[53725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:54:30 compute-0 sudo[53881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzqfwynujotxudqtampukhsfvqjinjtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118869.7155871-320-42979122639652/AnsiballZ_dnf.py'
Jan 22 21:54:30 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 22 21:54:30 compute-0 sudo[53881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:30 compute-0 python3.9[53883]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:54:31 compute-0 sudo[53881]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:32 compute-0 sudo[54034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvrzbccurryixgcwewtwhzjqwabaalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118871.9476764-344-111029024394263/AnsiballZ_command.py'
Jan 22 21:54:32 compute-0 sudo[54034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:32 compute-0 python3.9[54036]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:54:33 compute-0 sudo[54034]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:33 compute-0 sudo[54321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkfofcqhewxmnbvyaqqljosrsfvmrnau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118873.5379498-368-29700146841737/AnsiballZ_file.py'
Jan 22 21:54:33 compute-0 sudo[54321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:34 compute-0 python3.9[54323]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 22 21:54:34 compute-0 sudo[54321]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:35 compute-0 python3.9[54473]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:54:35 compute-0 sudo[54625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cepoelwfwdakhvkdycgksjqzlkiznmdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118875.291023-416-254581144463416/AnsiballZ_dnf.py'
Jan 22 21:54:35 compute-0 sudo[54625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:35 compute-0 python3.9[54627]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:54:37 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:54:37 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:54:37 compute-0 systemd[1]: Reloading.
Jan 22 21:54:37 compute-0 systemd-rc-local-generator[54665]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:54:37 compute-0 systemd-sysv-generator[54671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:54:37 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:54:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:54:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:54:38 compute-0 systemd[1]: run-r29d94f68b726407c9fd6d5f28b5e0a57.service: Deactivated successfully.
Jan 22 21:54:38 compute-0 sudo[54625]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:38 compute-0 sudo[54942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoxyntbylniidvrlzqwnxdroltfktbgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118878.4706883-440-112223888775175/AnsiballZ_systemd.py'
Jan 22 21:54:38 compute-0 sudo[54942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:39 compute-0 python3.9[54944]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:54:39 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 21:54:39 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 21:54:39 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 21:54:39 compute-0 systemd[1]: Stopping Network Manager...
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2133] caught SIGTERM, shutting down normally.
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2162] dhcp4 (eth0): canceled DHCP transaction
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2163] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2163] dhcp4 (eth0): state changed no lease
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2172] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 21:54:39 compute-0 NetworkManager[7195]: <info>  [1769118879.2255] exiting (success)
Jan 22 21:54:39 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:54:39 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:54:39 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 21:54:39 compute-0 systemd[1]: Stopped Network Manager.
Jan 22 21:54:39 compute-0 systemd[1]: NetworkManager.service: Consumed 17.439s CPU time, 4.1M memory peak, read 0B from disk, written 27.5K to disk.
Jan 22 21:54:39 compute-0 systemd[1]: Starting Network Manager...
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.2822] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:7bdd0997-5020-422e-9e39-85d77ba7ab4a)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.2823] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.2885] manager[0x5611ff9e2000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 21:54:39 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 21:54:39 compute-0 systemd[1]: Started Hostname Service.
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3911] hostname: hostname: using hostnamed
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3913] hostname: static hostname changed from (none) to "compute-0"
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3919] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3924] manager[0x5611ff9e2000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3924] manager[0x5611ff9e2000]: rfkill: WWAN hardware radio set enabled
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3949] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3960] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3961] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3962] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3962] manager: Networking is enabled by state file
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3964] settings: Loaded settings plugin: keyfile (internal)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3968] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.3996] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4005] dhcp: init: Using DHCP client 'internal'
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4008] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4015] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4029] device (lo): Activation: starting connection 'lo' (b467b8bc-34e9-40e6-be6e-e31b90d2564c)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4036] device (eth0): carrier: link connected
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4041] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4046] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4046] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4053] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4060] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4067] device (eth1): carrier: link connected
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4071] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4077] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8e0b80b2-e493-5867-80c6-016d3ab3d59e) (indicated)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4078] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4083] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4091] device (eth1): Activation: starting connection 'ci-private-network' (8e0b80b2-e493-5867-80c6-016d3ab3d59e)
Jan 22 21:54:39 compute-0 systemd[1]: Started Network Manager.
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4099] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4121] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4126] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4129] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4132] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4136] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4140] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4144] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4150] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4160] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4165] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4178] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4198] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4212] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4215] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4223] device (lo): Activation: successful, device activated.
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4234] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4258] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4370] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4383] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4387] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4394] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4399] device (eth1): Activation: successful, device activated.
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4441] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4444] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4451] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4456] device (eth0): Activation: successful, device activated.
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4464] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 21:54:39 compute-0 NetworkManager[54954]: <info>  [1769118879.4473] manager: startup complete
Jan 22 21:54:39 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 22 21:54:39 compute-0 sudo[54942]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:40 compute-0 sudo[55169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnqiyveojrsycicmeiivgqckvhcqrogr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118879.8896027-464-159590961453746/AnsiballZ_dnf.py'
Jan 22 21:54:40 compute-0 sudo[55169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:40 compute-0 python3.9[55171]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:54:44 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 21:54:44 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 21:54:44 compute-0 systemd[1]: Reloading.
Jan 22 21:54:45 compute-0 systemd-rc-local-generator[55222]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:54:45 compute-0 systemd-sysv-generator[55227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:54:45 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 21:54:45 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 21:54:45 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 21:54:45 compute-0 systemd[1]: run-rf54f054dd8d142c0a647e44fff5c6792.service: Deactivated successfully.
Jan 22 21:54:46 compute-0 sudo[55169]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:49 compute-0 sudo[55629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqggpwcrbhmdnvskjssarvrvyduwramn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118888.7714415-500-120636022514742/AnsiballZ_stat.py'
Jan 22 21:54:49 compute-0 sudo[55629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:49 compute-0 python3.9[55631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:54:49 compute-0 sudo[55629]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:49 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:54:49 compute-0 sudo[55781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxoxtejykfdaetkvjtriqjuhcghxmzts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118889.5460906-527-126489524265541/AnsiballZ_ini_file.py'
Jan 22 21:54:49 compute-0 sudo[55781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:50 compute-0 python3.9[55783]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:50 compute-0 sudo[55781]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:50 compute-0 sudo[55935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrpffndwlkdwkbiwcwpebudbqyzwqogj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118890.5509915-557-192235124874946/AnsiballZ_ini_file.py'
Jan 22 21:54:50 compute-0 sudo[55935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:51 compute-0 python3.9[55937]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:51 compute-0 sudo[55935]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:51 compute-0 sudo[56087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwxygsktyurvjsivlqfyoynkmyyuwfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118891.2205782-557-136075535327485/AnsiballZ_ini_file.py'
Jan 22 21:54:51 compute-0 sudo[56087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:51 compute-0 python3.9[56089]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:51 compute-0 sudo[56087]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:52 compute-0 sudo[56239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrlvixpjewnpzjbtuunqpphvereraim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118891.9525855-602-128790681500835/AnsiballZ_ini_file.py'
Jan 22 21:54:52 compute-0 sudo[56239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:52 compute-0 python3.9[56241]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:52 compute-0 sudo[56239]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:52 compute-0 sudo[56391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gechfdkghwwlugliobicwzsgsthwchgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118892.6254194-602-80618231116391/AnsiballZ_ini_file.py'
Jan 22 21:54:52 compute-0 sudo[56391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:53 compute-0 python3.9[56393]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:53 compute-0 sudo[56391]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:53 compute-0 sudo[56543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmjjdgfoldyvsycrmnjogdpkxkgjltt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118893.356791-647-112085081081604/AnsiballZ_stat.py'
Jan 22 21:54:53 compute-0 sudo[56543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:53 compute-0 python3.9[56545]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:54:53 compute-0 sudo[56543]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:54 compute-0 sudo[56666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygeehsfennejiidamybrqydmmmugrntk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118893.356791-647-112085081081604/AnsiballZ_copy.py'
Jan 22 21:54:54 compute-0 sudo[56666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:54 compute-0 python3.9[56668]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118893.356791-647-112085081081604/.source _original_basename=.c_3gk9tl follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:54 compute-0 sudo[56666]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:54 compute-0 sudo[56818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnlhrbzgcbgzpmhvnvybyygdkmkulwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118894.6726515-692-120999904938228/AnsiballZ_file.py'
Jan 22 21:54:54 compute-0 sudo[56818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:55 compute-0 python3.9[56820]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:55 compute-0 sudo[56818]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:55 compute-0 sudo[56970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kidhaoqwnfxhifdmisqxmjwxeazrbtqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118895.337806-716-182509066381747/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 22 21:54:55 compute-0 sudo[56970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:55 compute-0 python3.9[56972]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 22 21:54:55 compute-0 sudo[56970]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:56 compute-0 sudo[57122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qserazubivijewonhklwvcwqtbffrepf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118896.2477856-743-190481832970711/AnsiballZ_file.py'
Jan 22 21:54:56 compute-0 sudo[57122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:56 compute-0 python3.9[57124]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:54:56 compute-0 sudo[57122]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:57 compute-0 sudo[57274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkvyxpojnwmejfwsffcixhoxjyaoqvqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118897.1335838-773-152835741653826/AnsiballZ_stat.py'
Jan 22 21:54:57 compute-0 sudo[57274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:57 compute-0 sudo[57274]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:58 compute-0 sudo[57397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqrsotzruqblhmbroliihgequochkfgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118897.1335838-773-152835741653826/AnsiballZ_copy.py'
Jan 22 21:54:58 compute-0 sudo[57397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:58 compute-0 sudo[57397]: pam_unix(sudo:session): session closed for user root
Jan 22 21:54:58 compute-0 sudo[57549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkgdsvoeppdclddswfaekqfvimcxbndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118898.479538-818-71253293434988/AnsiballZ_slurp.py'
Jan 22 21:54:58 compute-0 sudo[57549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:54:59 compute-0 python3.9[57551]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 22 21:54:59 compute-0 sudo[57549]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:00 compute-0 sudo[57724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxrqsghbeizdodezoqbwkiieoeiaphag ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118899.4072025-845-14336471377905/async_wrapper.py j64151130745 300 /home/zuul/.ansible/tmp/ansible-tmp-1769118899.4072025-845-14336471377905/AnsiballZ_edpm_os_net_config.py _'
Jan 22 21:55:00 compute-0 sudo[57724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:00 compute-0 ansible-async_wrapper.py[57726]: Invoked with j64151130745 300 /home/zuul/.ansible/tmp/ansible-tmp-1769118899.4072025-845-14336471377905/AnsiballZ_edpm_os_net_config.py _
Jan 22 21:55:00 compute-0 ansible-async_wrapper.py[57729]: Starting module and watcher
Jan 22 21:55:00 compute-0 ansible-async_wrapper.py[57729]: Start watching 57730 (300)
Jan 22 21:55:00 compute-0 ansible-async_wrapper.py[57730]: Start module (57730)
Jan 22 21:55:00 compute-0 ansible-async_wrapper.py[57726]: Return async_wrapper task started.
Jan 22 21:55:00 compute-0 sudo[57724]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:00 compute-0 python3.9[57731]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 22 21:55:01 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 22 21:55:01 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 22 21:55:01 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 22 21:55:01 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 22 21:55:01 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.6627] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.6654] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7528] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7529] audit: op="connection-add" uuid="ff65f5d8-c578-495d-9409-8c65e0c0b785" name="br-ex-br" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7557] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7558] audit: op="connection-add" uuid="d1d9c63c-7894-4b78-93e7-5a52a5b7fd9f" name="br-ex-port" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7579] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7582] audit: op="connection-add" uuid="0219da10-317f-4f94-9d49-8a2596fe5b0d" name="eth1-port" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7603] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7606] audit: op="connection-add" uuid="18529efe-43b6-4071-b3d7-286ae573f7a7" name="vlan20-port" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7627] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7629] audit: op="connection-add" uuid="c465aded-c644-46ff-920b-ef3c4450ab7a" name="vlan21-port" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7649] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7651] audit: op="connection-add" uuid="4747c090-861f-47e1-b52b-e0efa518959d" name="vlan22-port" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7682] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7706] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7708] audit: op="connection-add" uuid="2a8ed92d-1e08-49a1-b0d3-cf40fb99a5fc" name="br-ex-if" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7789] audit: op="connection-update" uuid="8e0b80b2-e493-5867-80c6-016d3ab3d59e" name="ci-private-network" args="connection.slave-type,connection.controller,connection.port-type,connection.master,connection.timestamp,ovs-interface.type,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.method,ipv6.addresses,ipv4.routes,ipv4.method,ipv4.dns,ipv4.routing-rules,ipv4.never-default,ipv4.addresses,ovs-external-ids.data" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7815] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7819] audit: op="connection-add" uuid="f6da1703-d13c-4b27-8bb6-86c1d4911ac9" name="vlan20-if" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7847] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7850] audit: op="connection-add" uuid="ab9bacb7-57b8-4d99-9842-2f3bc7163f9e" name="vlan21-if" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7874] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7877] audit: op="connection-add" uuid="3e755b97-8491-462b-9fde-fe78efd2136e" name="vlan22-if" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7898] audit: op="connection-delete" uuid="eb7d8611-2061-33cb-9530-bc24d43f8bde" name="Wired connection 1" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7921] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.7923] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7935] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7942] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ff65f5d8-c578-495d-9409-8c65e0c0b785)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7945] audit: op="connection-activate" uuid="ff65f5d8-c578-495d-9409-8c65e0c0b785" name="br-ex-br" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7948] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.7950] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7959] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7966] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (d1d9c63c-7894-4b78-93e7-5a52a5b7fd9f)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7969] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.7971] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7979] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7986] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (0219da10-317f-4f94-9d49-8a2596fe5b0d)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7989] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.7990] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.7997] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8003] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (18529efe-43b6-4071-b3d7-286ae573f7a7)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8006] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8007] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8013] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8019] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c465aded-c644-46ff-920b-ef3c4450ab7a)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8021] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8022] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8029] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8035] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (4747c090-861f-47e1-b52b-e0efa518959d)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8036] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8039] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8042] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8049] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8050] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8056] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8062] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (2a8ed92d-1e08-49a1-b0d3-cf40fb99a5fc)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8063] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8068] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8070] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8071] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8073] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8087] device (eth1): disconnecting for new activation request.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8088] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8091] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8094] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8097] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8102] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8103] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8107] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8113] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f6da1703-d13c-4b27-8bb6-86c1d4911ac9)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8114] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8118] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8120] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8121] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8125] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8127] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8131] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8137] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ab9bacb7-57b8-4d99-9842-2f3bc7163f9e)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8138] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8142] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8144] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8145] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8149] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <warn>  [1769118902.8150] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8154] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8160] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3e755b97-8491-462b-9fde-fe78efd2136e)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8160] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8164] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8166] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8168] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8170] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8186] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8190] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8195] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8196] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8205] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8210] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8216] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8220] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8222] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8242] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8246] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 21:55:02 compute-0 kernel: Timeout policy base is empty
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8250] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8252] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 systemd-udevd[57737]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8258] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8262] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8265] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8277] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8284] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8291] dhcp4 (eth0): canceled DHCP transaction
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8291] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8292] dhcp4 (eth0): state changed no lease
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8293] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8306] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8310] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57732 uid=0 result="fail" reason="Device is not activated"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8344] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8352] device (eth1): disconnecting for new activation request.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8352] audit: op="connection-activate" uuid="8e0b80b2-e493-5867-80c6-016d3ab3d59e" name="ci-private-network" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8354] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8363] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8365] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 22 21:55:02 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8416] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8545] device (eth1): Activation: starting connection 'ci-private-network' (8e0b80b2-e493-5867-80c6-016d3ab3d59e)
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8551] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8553] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57732 uid=0 result="success"
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8561] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8568] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8574] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8590] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8597] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8599] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8600] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8602] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8604] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8608] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8616] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8630] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8635] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8641] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8646] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8651] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8656] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8662] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8667] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8672] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8679] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8685] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8725] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8727] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8735] device (eth1): Activation: successful, device activated.
Jan 22 21:55:02 compute-0 kernel: br-ex: entered promiscuous mode
Jan 22 21:55:02 compute-0 kernel: vlan22: entered promiscuous mode
Jan 22 21:55:02 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 22 21:55:02 compute-0 systemd-udevd[57738]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8905] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8923] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 kernel: vlan21: entered promiscuous mode
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8944] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8945] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.8951] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 kernel: vlan20: entered promiscuous mode
Jan 22 21:55:02 compute-0 systemd-udevd[57736]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9014] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9024] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9043] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9045] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9051] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9062] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9074] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9113] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9115] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9119] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9159] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9173] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9188] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9189] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 21:55:02 compute-0 NetworkManager[54954]: <info>  [1769118902.9193] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 21:55:03 compute-0 sudo[58063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzuifjbastnmpeutlmnhtcmaztfgqpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118903.499135-845-244378432174918/AnsiballZ_async_status.py'
Jan 22 21:55:03 compute-0 sudo[58063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.0406] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 python3.9[58065]: ansible-ansible.legacy.async_status Invoked with jid=j64151130745.57726 mode=status _async_dir=/root/.ansible_async
Jan 22 21:55:04 compute-0 sudo[58063]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.2550] checkpoint[0x5611ff9b7950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.2551] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.5442] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.5458] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.8136] audit: op="networking-control" arg="global-dns-configuration" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.8165] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.8203] audit: op="networking-control" arg="global-dns-configuration" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.8221] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57732 uid=0 result="success"
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.9458] checkpoint[0x5611ff9b7a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 22 21:55:04 compute-0 NetworkManager[54954]: <info>  [1769118904.9460] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57732 uid=0 result="success"
Jan 22 21:55:05 compute-0 ansible-async_wrapper.py[57730]: Module complete (57730)
Jan 22 21:55:05 compute-0 ansible-async_wrapper.py[57729]: Done in kid B.
Jan 22 21:55:07 compute-0 sudo[58169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eovnfctzzmwsucqpyptdxsxhihyepowg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118903.499135-845-244378432174918/AnsiballZ_async_status.py'
Jan 22 21:55:07 compute-0 sudo[58169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:07 compute-0 python3.9[58171]: ansible-ansible.legacy.async_status Invoked with jid=j64151130745.57726 mode=status _async_dir=/root/.ansible_async
Jan 22 21:55:07 compute-0 sudo[58169]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:07 compute-0 sudo[58269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcryuestxmnkwxhcfyurvthilochbiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118903.499135-845-244378432174918/AnsiballZ_async_status.py'
Jan 22 21:55:07 compute-0 sudo[58269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:08 compute-0 python3.9[58271]: ansible-ansible.legacy.async_status Invoked with jid=j64151130745.57726 mode=cleanup _async_dir=/root/.ansible_async
Jan 22 21:55:08 compute-0 sudo[58269]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:08 compute-0 sudo[58421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejvulrdwynisibpcaiwtnqjpqqpbajzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118908.4462006-921-257946115563963/AnsiballZ_stat.py'
Jan 22 21:55:08 compute-0 sudo[58421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:08 compute-0 python3.9[58423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:55:08 compute-0 sudo[58421]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:09 compute-0 sudo[58544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knudvetysmqozrjknlswccjxmdbjilfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118908.4462006-921-257946115563963/AnsiballZ_copy.py'
Jan 22 21:55:09 compute-0 sudo[58544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:09 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 21:55:09 compute-0 python3.9[58546]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118908.4462006-921-257946115563963/.source.returncode _original_basename=.lv20kv4s follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:55:09 compute-0 sudo[58544]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:12 compute-0 sudo[58699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iysduiogsexloarkmkhtfggywretahwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118911.912824-969-68905942030064/AnsiballZ_stat.py'
Jan 22 21:55:12 compute-0 sudo[58699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:12 compute-0 python3.9[58701]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:55:12 compute-0 sudo[58699]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:12 compute-0 sudo[58822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccdvaawiedilzqskqzmkueqintrourty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118911.912824-969-68905942030064/AnsiballZ_copy.py'
Jan 22 21:55:12 compute-0 sudo[58822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:12 compute-0 python3.9[58824]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118911.912824-969-68905942030064/.source.cfg _original_basename=.becwpmuq follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:55:12 compute-0 sudo[58822]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:13 compute-0 sudo[58974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yorcgpgstbsiabpmcwwzchmvpkekuvku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118913.3095412-1014-59462809031937/AnsiballZ_systemd.py'
Jan 22 21:55:13 compute-0 sudo[58974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:13 compute-0 python3.9[58976]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:55:13 compute-0 systemd[1]: Reloading Network Manager...
Jan 22 21:55:14 compute-0 NetworkManager[54954]: <info>  [1769118914.0059] audit: op="reload" arg="0" pid=58980 uid=0 result="success"
Jan 22 21:55:14 compute-0 NetworkManager[54954]: <info>  [1769118914.0069] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 22 21:55:14 compute-0 systemd[1]: Reloaded Network Manager.
Jan 22 21:55:14 compute-0 sudo[58974]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:14 compute-0 sshd-session[50959]: Connection closed by 192.168.122.30 port 54590
Jan 22 21:55:14 compute-0 sshd-session[50956]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:55:14 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 22 21:55:14 compute-0 systemd-logind[801]: Session 11 logged out. Waiting for processes to exit.
Jan 22 21:55:14 compute-0 systemd[1]: session-11.scope: Consumed 54.013s CPU time.
Jan 22 21:55:14 compute-0 systemd-logind[801]: Removed session 11.
Jan 22 21:55:19 compute-0 sshd-session[59011]: Accepted publickey for zuul from 192.168.122.30 port 47940 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:55:19 compute-0 systemd-logind[801]: New session 12 of user zuul.
Jan 22 21:55:19 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 22 21:55:19 compute-0 sshd-session[59011]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:55:20 compute-0 python3.9[59165]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:55:22 compute-0 python3.9[59319]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:55:23 compute-0 python3.9[59508]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:55:23 compute-0 sshd-session[59014]: Connection closed by 192.168.122.30 port 47940
Jan 22 21:55:23 compute-0 sshd-session[59011]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:55:23 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 22 21:55:23 compute-0 systemd[1]: session-12.scope: Consumed 2.727s CPU time.
Jan 22 21:55:23 compute-0 systemd-logind[801]: Session 12 logged out. Waiting for processes to exit.
Jan 22 21:55:23 compute-0 systemd-logind[801]: Removed session 12.
Jan 22 21:55:24 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 21:55:29 compute-0 sshd-session[59537]: Accepted publickey for zuul from 192.168.122.30 port 41588 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:55:29 compute-0 systemd-logind[801]: New session 13 of user zuul.
Jan 22 21:55:29 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 22 21:55:29 compute-0 sshd-session[59537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:55:30 compute-0 python3.9[59691]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:55:31 compute-0 python3.9[59845]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:55:32 compute-0 sudo[59999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltutjomwewbinneuvkuzcwwderbzayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118931.9541934-80-274708415725227/AnsiballZ_setup.py'
Jan 22 21:55:32 compute-0 sudo[59999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:32 compute-0 python3.9[60001]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:55:32 compute-0 sudo[59999]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:33 compute-0 sudo[60083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-farlpcoirwgwvxkwthmxjetntjootcve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118931.9541934-80-274708415725227/AnsiballZ_dnf.py'
Jan 22 21:55:33 compute-0 sudo[60083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:33 compute-0 python3.9[60085]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:55:34 compute-0 sudo[60083]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:35 compute-0 sudo[60237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuzyuqtqqqvztpwstazdgstfywcybsyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118935.0462892-116-243572820425794/AnsiballZ_setup.py'
Jan 22 21:55:35 compute-0 sudo[60237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:35 compute-0 python3.9[60239]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:55:35 compute-0 sudo[60237]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:36 compute-0 sudo[60428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvaqwrxrlmbtzpsmcwzhyxeusldfxkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118936.24183-149-233863772550511/AnsiballZ_file.py'
Jan 22 21:55:36 compute-0 sudo[60428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:36 compute-0 python3.9[60430]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:55:36 compute-0 sudo[60428]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:37 compute-0 sudo[60580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eafimoxlnbdohnhlfhtdwftlpykrknwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118937.0522666-173-30374948548898/AnsiballZ_command.py'
Jan 22 21:55:37 compute-0 sudo[60580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:37 compute-0 python3.9[60582]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:55:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:55:37 compute-0 sudo[60580]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:38 compute-0 sudo[60743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpinmjbmmqoirvbwaaottudvdekgegbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118938.0167704-197-13477502077761/AnsiballZ_stat.py'
Jan 22 21:55:38 compute-0 sudo[60743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:38 compute-0 python3.9[60745]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:55:38 compute-0 sudo[60743]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:39 compute-0 sudo[60821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asyhpiecucmvhwdvhgazpbdldpccpptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118938.0167704-197-13477502077761/AnsiballZ_file.py'
Jan 22 21:55:39 compute-0 sudo[60821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:39 compute-0 python3.9[60823]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:55:39 compute-0 sudo[60821]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:39 compute-0 sudo[60973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilaohtvuhjmlprlogpquhaatpxlbqkwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118939.6122732-233-211197211709563/AnsiballZ_stat.py'
Jan 22 21:55:39 compute-0 sudo[60973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:40 compute-0 python3.9[60975]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:55:40 compute-0 sudo[60973]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:40 compute-0 sudo[61051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-korbsvxbhadqfoqaqgjxkhxxgbhkwxmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118939.6122732-233-211197211709563/AnsiballZ_file.py'
Jan 22 21:55:40 compute-0 sudo[61051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:40 compute-0 python3.9[61053]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:55:40 compute-0 sudo[61051]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:41 compute-0 sudo[61203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzxilpzzvtgaifmmntlxirilazrqata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118940.8742726-272-128458752383137/AnsiballZ_ini_file.py'
Jan 22 21:55:41 compute-0 sudo[61203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:41 compute-0 python3.9[61205]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:55:41 compute-0 sudo[61203]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:42 compute-0 sudo[61355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdkulyxcvvbxsuvvgikoawsrqqgjqgcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118941.6793065-272-253683060017665/AnsiballZ_ini_file.py'
Jan 22 21:55:42 compute-0 sudo[61355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:42 compute-0 python3.9[61357]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:55:42 compute-0 sudo[61355]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:42 compute-0 sudo[61507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfinhgpfruxvykacaqfkbexbsclxzyof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118942.4249196-272-55819880333679/AnsiballZ_ini_file.py'
Jan 22 21:55:42 compute-0 sudo[61507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:43 compute-0 python3.9[61509]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:55:43 compute-0 sudo[61507]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:43 compute-0 sudo[61659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfameeqycxkzjlnwamxitrnfisaqszis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118943.192053-272-13104944242208/AnsiballZ_ini_file.py'
Jan 22 21:55:43 compute-0 sudo[61659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:43 compute-0 python3.9[61661]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:55:43 compute-0 sudo[61659]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:44 compute-0 sudo[61811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxkououlgslihhxdywwhcrobcbpvpwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118944.0533118-365-176658580167898/AnsiballZ_dnf.py'
Jan 22 21:55:44 compute-0 sudo[61811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:44 compute-0 python3.9[61813]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:55:45 compute-0 sudo[61811]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:47 compute-0 sudo[61964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpuqyrdmbgwhwjklkklufsvfmthtfaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118946.705557-398-174370700430412/AnsiballZ_setup.py'
Jan 22 21:55:47 compute-0 sudo[61964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:47 compute-0 python3.9[61966]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:55:47 compute-0 sudo[61964]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:47 compute-0 sudo[62118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiprtriovtiygcduydigzbmepqxqwywn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118947.5693743-422-143730671099883/AnsiballZ_stat.py'
Jan 22 21:55:47 compute-0 sudo[62118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:48 compute-0 python3.9[62120]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:55:48 compute-0 sudo[62118]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:48 compute-0 sudo[62270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcnriisntolhejvckoetdgvxgkviwfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118948.3600614-449-245510110807800/AnsiballZ_stat.py'
Jan 22 21:55:48 compute-0 sudo[62270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:48 compute-0 python3.9[62272]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:55:48 compute-0 sudo[62270]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:49 compute-0 sudo[62422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmzwtqqicmjsayynrhsddqrdjawpehwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118949.2181072-479-10748552318995/AnsiballZ_command.py'
Jan 22 21:55:49 compute-0 sudo[62422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:49 compute-0 python3.9[62424]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:55:49 compute-0 sudo[62422]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:50 compute-0 sudo[62575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnbkomxwygnimwlwmzetgwsdaqegivkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118950.1505334-509-40965360933999/AnsiballZ_service_facts.py'
Jan 22 21:55:50 compute-0 sudo[62575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:50 compute-0 python3.9[62577]: ansible-service_facts Invoked
Jan 22 21:55:50 compute-0 network[62594]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 21:55:50 compute-0 network[62595]: 'network-scripts' will be removed from distribution in near future.
Jan 22 21:55:50 compute-0 network[62596]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 21:55:55 compute-0 sudo[62575]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:56 compute-0 sudo[62879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fakwatkntohjtxlonniagafljgylttyk ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769118955.8808024-554-179782860945398/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769118955.8808024-554-179782860945398/args'
Jan 22 21:55:56 compute-0 sudo[62879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:56 compute-0 sudo[62879]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:56 compute-0 sudo[63046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxxzsckyzyizmqlfstauvjvvuizlcvze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118956.57026-587-209803161887262/AnsiballZ_dnf.py'
Jan 22 21:55:56 compute-0 sudo[63046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:57 compute-0 python3.9[63048]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:55:58 compute-0 sudo[63046]: pam_unix(sudo:session): session closed for user root
Jan 22 21:55:59 compute-0 sudo[63199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhflnwnbwpngxpbiqgkdjgabqcxpvrxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118958.9816918-626-156932299654508/AnsiballZ_package_facts.py'
Jan 22 21:55:59 compute-0 sudo[63199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:55:59 compute-0 python3.9[63201]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 22 21:56:00 compute-0 sudo[63199]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:01 compute-0 sudo[63351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawjtobmcpynuwcxhwweyvkevllvsanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118960.809336-656-237597398394500/AnsiballZ_stat.py'
Jan 22 21:56:01 compute-0 sudo[63351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:01 compute-0 python3.9[63353]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:01 compute-0 sudo[63351]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:01 compute-0 sudo[63476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fowjrjfcucvggixngzcnhfuxjhfosrzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118960.809336-656-237597398394500/AnsiballZ_copy.py'
Jan 22 21:56:01 compute-0 sudo[63476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:02 compute-0 python3.9[63478]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118960.809336-656-237597398394500/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:02 compute-0 sudo[63476]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:02 compute-0 sudo[63630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puowdzstcmjpecieyqdkjwshtiijnymp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118962.2532732-701-232895361563908/AnsiballZ_stat.py'
Jan 22 21:56:02 compute-0 sudo[63630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:02 compute-0 python3.9[63632]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:02 compute-0 sudo[63630]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:03 compute-0 sudo[63755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfzicuwosvthujanmdcnrrjzbsgxauot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118962.2532732-701-232895361563908/AnsiballZ_copy.py'
Jan 22 21:56:03 compute-0 sudo[63755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:03 compute-0 python3.9[63757]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118962.2532732-701-232895361563908/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:04 compute-0 sudo[63755]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:05 compute-0 sudo[63909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opqklgnadrxjzfgivcfpkorgctupvjqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118965.323939-764-210656287176819/AnsiballZ_lineinfile.py'
Jan 22 21:56:05 compute-0 sudo[63909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:06 compute-0 python3.9[63911]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:06 compute-0 sudo[63909]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:07 compute-0 sudo[64063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwhzxkgjdwulcebnrmqfacgwunkkrzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118967.1757197-809-60874617788260/AnsiballZ_setup.py'
Jan 22 21:56:07 compute-0 sudo[64063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:07 compute-0 python3.9[64065]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:56:08 compute-0 sudo[64063]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:08 compute-0 sudo[64147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhegfeiewmyelkwjncprmnutkhokchib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118967.1757197-809-60874617788260/AnsiballZ_systemd.py'
Jan 22 21:56:08 compute-0 sudo[64147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:08 compute-0 python3.9[64149]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:08 compute-0 sudo[64147]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:11 compute-0 sudo[64301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbbbarizidzryohiamfptwdncgtldcbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118970.8342113-857-44116784765935/AnsiballZ_setup.py'
Jan 22 21:56:11 compute-0 sudo[64301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:11 compute-0 python3.9[64303]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:56:11 compute-0 sudo[64301]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:12 compute-0 sudo[64385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygbvzwbfmucszhatyffmtukfcealjne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118970.8342113-857-44116784765935/AnsiballZ_systemd.py'
Jan 22 21:56:12 compute-0 sudo[64385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:12 compute-0 python3.9[64387]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:56:12 compute-0 chronyd[786]: chronyd exiting
Jan 22 21:56:12 compute-0 systemd[1]: Stopping NTP client/server...
Jan 22 21:56:12 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 22 21:56:12 compute-0 systemd[1]: Stopped NTP client/server.
Jan 22 21:56:12 compute-0 systemd[1]: Starting NTP client/server...
Jan 22 21:56:12 compute-0 chronyd[64395]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 21:56:12 compute-0 chronyd[64395]: Frequency -24.805 +/- 0.351 ppm read from /var/lib/chrony/drift
Jan 22 21:56:12 compute-0 chronyd[64395]: Loaded seccomp filter (level 2)
Jan 22 21:56:12 compute-0 systemd[1]: Started NTP client/server.
Jan 22 21:56:12 compute-0 sudo[64385]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:12 compute-0 sshd-session[59540]: Connection closed by 192.168.122.30 port 41588
Jan 22 21:56:12 compute-0 sshd-session[59537]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:56:12 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 22 21:56:12 compute-0 systemd[1]: session-13.scope: Consumed 28.094s CPU time.
Jan 22 21:56:12 compute-0 systemd-logind[801]: Session 13 logged out. Waiting for processes to exit.
Jan 22 21:56:12 compute-0 systemd-logind[801]: Removed session 13.
Jan 22 21:56:18 compute-0 sshd-session[64421]: Accepted publickey for zuul from 192.168.122.30 port 36982 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:56:18 compute-0 systemd-logind[801]: New session 14 of user zuul.
Jan 22 21:56:18 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 22 21:56:18 compute-0 sshd-session[64421]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:56:19 compute-0 python3.9[64574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:56:20 compute-0 sudo[64728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stobnpunsckmeefgitnvtxxcupwqooat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118980.2941775-59-201041417363490/AnsiballZ_file.py'
Jan 22 21:56:20 compute-0 sudo[64728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:20 compute-0 python3.9[64730]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:21 compute-0 sudo[64728]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:21 compute-0 sudo[64903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiwaatqbbygzvqgcgnocpsfeyrklrydf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118981.1929939-83-126865454346706/AnsiballZ_stat.py'
Jan 22 21:56:21 compute-0 sudo[64903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:21 compute-0 python3.9[64905]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:21 compute-0 sudo[64903]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:22 compute-0 sudo[64981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iccbgrrkrcmptouudatqeyisgnrqpryj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118981.1929939-83-126865454346706/AnsiballZ_file.py'
Jan 22 21:56:22 compute-0 sudo[64981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:22 compute-0 python3.9[64983]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.d3bryjga recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:22 compute-0 sudo[64981]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:23 compute-0 sudo[65133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnmscrydqgiebrdxcukpymhvaczbfjar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118982.8330615-143-173513630281325/AnsiballZ_stat.py'
Jan 22 21:56:23 compute-0 sudo[65133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:23 compute-0 python3.9[65135]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:23 compute-0 sudo[65133]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:23 compute-0 sudo[65256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrvhrdxzxqevopxwojhnbjmhvptrkhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118982.8330615-143-173513630281325/AnsiballZ_copy.py'
Jan 22 21:56:23 compute-0 sudo[65256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:24 compute-0 python3.9[65258]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118982.8330615-143-173513630281325/.source _original_basename=.rxgmy8e9 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:24 compute-0 sudo[65256]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:24 compute-0 sudo[65408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwvghosfwmcgifpbwqjfcwuyvmpdfezp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118984.2252645-191-170290848138653/AnsiballZ_file.py'
Jan 22 21:56:24 compute-0 sudo[65408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:24 compute-0 python3.9[65410]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:56:24 compute-0 sudo[65408]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:25 compute-0 sudo[65560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffamyhmmdoqzywfzmtaucrkrfnnkzolx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118984.924681-215-182375076011587/AnsiballZ_stat.py'
Jan 22 21:56:25 compute-0 sudo[65560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:25 compute-0 python3.9[65562]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:25 compute-0 sudo[65560]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:25 compute-0 sudo[65683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtgytanzbbwsmkhaadungekxvtanizfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118984.924681-215-182375076011587/AnsiballZ_copy.py'
Jan 22 21:56:25 compute-0 sudo[65683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:26 compute-0 python3.9[65685]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118984.924681-215-182375076011587/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:56:26 compute-0 sudo[65683]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:26 compute-0 sudo[65835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaggndhpyraatsvtluijijabuavafisi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118986.3051245-215-152202648858959/AnsiballZ_stat.py'
Jan 22 21:56:26 compute-0 sudo[65835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:26 compute-0 python3.9[65837]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:26 compute-0 sudo[65835]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:27 compute-0 sudo[65958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbmvmkkfjmksgxzefaoqettlsannblnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118986.3051245-215-152202648858959/AnsiballZ_copy.py'
Jan 22 21:56:27 compute-0 sudo[65958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:27 compute-0 python3.9[65960]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118986.3051245-215-152202648858959/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:56:27 compute-0 sudo[65958]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:27 compute-0 sudo[66110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdqewislhlsxslgeoyfesolrmxdhyqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118987.606107-302-72185028019351/AnsiballZ_file.py'
Jan 22 21:56:27 compute-0 sudo[66110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:28 compute-0 python3.9[66112]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:28 compute-0 sudo[66110]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:28 compute-0 sudo[66262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejijpxroqlykcqgiarskytpcezdfpizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118988.378994-326-212768945522286/AnsiballZ_stat.py'
Jan 22 21:56:28 compute-0 sudo[66262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:28 compute-0 python3.9[66264]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:28 compute-0 sudo[66262]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:29 compute-0 sudo[66386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giifhvwshaflszfqjsexytdjexbkviop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118988.378994-326-212768945522286/AnsiballZ_copy.py'
Jan 22 21:56:29 compute-0 sudo[66386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:29 compute-0 python3.9[66388]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118988.378994-326-212768945522286/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:29 compute-0 sudo[66386]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:30 compute-0 sudo[66538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmqzangzbxfutbxzeabkfutjrmkchywv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118989.6948986-371-49469827692244/AnsiballZ_stat.py'
Jan 22 21:56:30 compute-0 sudo[66538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:30 compute-0 python3.9[66540]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:30 compute-0 sudo[66538]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:30 compute-0 sudo[66661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyezeillhcfmflqpalspsnurigeqbmgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118989.6948986-371-49469827692244/AnsiballZ_copy.py'
Jan 22 21:56:30 compute-0 sudo[66661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:30 compute-0 python3.9[66663]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118989.6948986-371-49469827692244/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:30 compute-0 sudo[66661]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:31 compute-0 sudo[66813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfxglxjpwizgomaoveuyuirlfyzelevc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118991.108943-416-134190597215771/AnsiballZ_systemd.py'
Jan 22 21:56:31 compute-0 sudo[66813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:32 compute-0 python3.9[66815]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:32 compute-0 systemd[1]: Reloading.
Jan 22 21:56:32 compute-0 systemd-rc-local-generator[66843]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:32 compute-0 systemd-sysv-generator[66846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:32 compute-0 systemd[1]: Reloading.
Jan 22 21:56:32 compute-0 systemd-sysv-generator[66875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:32 compute-0 systemd-rc-local-generator[66872]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:32 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 22 21:56:32 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 22 21:56:32 compute-0 sudo[66813]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:33 compute-0 sudo[67042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyhqupmayfznyldbnadxhhhfolxwpcwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118992.799028-440-20449439244283/AnsiballZ_stat.py'
Jan 22 21:56:33 compute-0 sudo[67042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:33 compute-0 python3.9[67044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:33 compute-0 sudo[67042]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:33 compute-0 sudo[67165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstbrcpnysmhesfujxkhoghfsafdydyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118992.799028-440-20449439244283/AnsiballZ_copy.py'
Jan 22 21:56:33 compute-0 sudo[67165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:33 compute-0 python3.9[67167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118992.799028-440-20449439244283/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:33 compute-0 sudo[67165]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:34 compute-0 sudo[67317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyhwpaujqiuoneeesyrziciegfbmrmng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118994.1757405-485-222041041040785/AnsiballZ_stat.py'
Jan 22 21:56:34 compute-0 sudo[67317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:34 compute-0 python3.9[67319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:34 compute-0 sudo[67317]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:35 compute-0 sudo[67440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrgommgkfvnccrwtqtawecpkvrjahvsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118994.1757405-485-222041041040785/AnsiballZ_copy.py'
Jan 22 21:56:35 compute-0 sudo[67440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:35 compute-0 python3.9[67442]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118994.1757405-485-222041041040785/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:35 compute-0 sudo[67440]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:35 compute-0 sudo[67592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylovvehtwzcwaerbfrvxtguygiayjony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769118995.5805752-530-210987621822965/AnsiballZ_systemd.py'
Jan 22 21:56:35 compute-0 sudo[67592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:36 compute-0 python3.9[67594]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:36 compute-0 systemd[1]: Reloading.
Jan 22 21:56:36 compute-0 systemd-rc-local-generator[67624]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:36 compute-0 systemd-sysv-generator[67628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:36 compute-0 systemd[1]: Reloading.
Jan 22 21:56:36 compute-0 systemd-sysv-generator[67662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:36 compute-0 systemd-rc-local-generator[67659]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:36 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 21:56:36 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 21:56:36 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 21:56:36 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 21:56:36 compute-0 sudo[67592]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:37 compute-0 python3.9[67820]: ansible-ansible.builtin.service_facts Invoked
Jan 22 21:56:37 compute-0 network[67837]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 21:56:37 compute-0 network[67838]: 'network-scripts' will be removed from distribution in near future.
Jan 22 21:56:37 compute-0 network[67839]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 21:56:41 compute-0 sudo[68099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wawdegslbavnjlczpvzuzzznbhhebcur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119001.4121451-578-44444508562363/AnsiballZ_systemd.py'
Jan 22 21:56:41 compute-0 sudo[68099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:41 compute-0 python3.9[68101]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:42 compute-0 systemd[1]: Reloading.
Jan 22 21:56:42 compute-0 systemd-sysv-generator[68130]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:42 compute-0 systemd-rc-local-generator[68126]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:42 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 22 21:56:42 compute-0 iptables.init[68141]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 22 21:56:42 compute-0 iptables.init[68141]: iptables: Flushing firewall rules: [  OK  ]
Jan 22 21:56:42 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 22 21:56:42 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 22 21:56:42 compute-0 sudo[68099]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:43 compute-0 sudo[68335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmolwognyrfutppgyvtrtlzekvktefuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119002.9235172-578-163472288484776/AnsiballZ_systemd.py'
Jan 22 21:56:43 compute-0 sudo[68335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:43 compute-0 python3.9[68337]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:43 compute-0 sudo[68335]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:46 compute-0 sudo[68489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brdlldyaagpcjanwqhbowtaddhgbysjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119006.1195505-626-151637728400970/AnsiballZ_systemd.py'
Jan 22 21:56:46 compute-0 sudo[68489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:46 compute-0 python3.9[68491]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:56:46 compute-0 systemd[1]: Reloading.
Jan 22 21:56:46 compute-0 systemd-sysv-generator[68527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:56:46 compute-0 systemd-rc-local-generator[68523]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:56:47 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 22 21:56:47 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 22 21:56:47 compute-0 sudo[68489]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:47 compute-0 sudo[68682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthumcvzoqkxcrmczprqwxrpxsfcdhsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119007.4144952-650-163399915108792/AnsiballZ_command.py'
Jan 22 21:56:47 compute-0 sudo[68682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:48 compute-0 python3.9[68684]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:56:48 compute-0 sudo[68682]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:49 compute-0 sudo[68835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtxbiwnbtrxwpzsrpwaimjmabrtcvatr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119008.707057-692-246424155547370/AnsiballZ_stat.py'
Jan 22 21:56:49 compute-0 sudo[68835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:49 compute-0 python3.9[68837]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:49 compute-0 sudo[68835]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:49 compute-0 sudo[68960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxmvhuqanhmacvlbfzgzcfwfqhlhszel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119008.707057-692-246424155547370/AnsiballZ_copy.py'
Jan 22 21:56:49 compute-0 sudo[68960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:49 compute-0 python3.9[68962]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119008.707057-692-246424155547370/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:49 compute-0 sudo[68960]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:50 compute-0 sudo[69113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmepiyaihrfyohyltyghmfmazxqixuvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119010.112263-737-95920083887753/AnsiballZ_systemd.py'
Jan 22 21:56:50 compute-0 sudo[69113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:50 compute-0 python3.9[69115]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:56:50 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 22 21:56:50 compute-0 sshd[1009]: Received SIGHUP; restarting.
Jan 22 21:56:50 compute-0 sshd[1009]: Server listening on 0.0.0.0 port 22.
Jan 22 21:56:50 compute-0 sshd[1009]: Server listening on :: port 22.
Jan 22 21:56:50 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 22 21:56:50 compute-0 sudo[69113]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:51 compute-0 sudo[69269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfnqyylzmtoamuxfsdosonkesmmtxfap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119011.1040897-761-79529192311966/AnsiballZ_file.py'
Jan 22 21:56:51 compute-0 sudo[69269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:51 compute-0 python3.9[69271]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:51 compute-0 sudo[69269]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:52 compute-0 sudo[69421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhiebnnnachdqdbvuwxpwcoyhpsbbcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119011.8849084-785-268847144166979/AnsiballZ_stat.py'
Jan 22 21:56:52 compute-0 sudo[69421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:52 compute-0 python3.9[69423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:52 compute-0 sudo[69421]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:52 compute-0 sudo[69544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpjrnorngyuoumtvpxkfequpvfxcvpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119011.8849084-785-268847144166979/AnsiballZ_copy.py'
Jan 22 21:56:52 compute-0 sudo[69544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:53 compute-0 python3.9[69546]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119011.8849084-785-268847144166979/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:53 compute-0 sudo[69544]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:53 compute-0 sudo[69696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lheslqmntasprnyoqnjoqbmnuxoylrgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119013.4626174-839-249327384696893/AnsiballZ_timezone.py'
Jan 22 21:56:53 compute-0 sudo[69696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:54 compute-0 python3.9[69698]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 21:56:54 compute-0 systemd[1]: Starting Time & Date Service...
Jan 22 21:56:54 compute-0 systemd[1]: Started Time & Date Service.
Jan 22 21:56:54 compute-0 sudo[69696]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:55 compute-0 sudo[69852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-danqpeyisthztrzldkiqxvfslioqrlfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119014.5979776-866-44935452932230/AnsiballZ_file.py'
Jan 22 21:56:55 compute-0 sudo[69852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:55 compute-0 python3.9[69854]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:55 compute-0 sudo[69852]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:55 compute-0 sudo[70004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tstdmptslfkjuokvsfkofiaoekxbsusy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119015.5178905-890-4199612962531/AnsiballZ_stat.py'
Jan 22 21:56:55 compute-0 sudo[70004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:56 compute-0 python3.9[70006]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:56 compute-0 sudo[70004]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:56 compute-0 sudo[70127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyxzqszewpvlwuzbzkwliegmslamrhlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119015.5178905-890-4199612962531/AnsiballZ_copy.py'
Jan 22 21:56:56 compute-0 sudo[70127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:56 compute-0 python3.9[70129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119015.5178905-890-4199612962531/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:56 compute-0 sudo[70127]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:57 compute-0 sudo[70279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewqzizxdepwqipbiqhsnhbtrgubhjslo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119016.774013-935-248086477277396/AnsiballZ_stat.py'
Jan 22 21:56:57 compute-0 sudo[70279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:57 compute-0 python3.9[70281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:57 compute-0 sudo[70279]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:57 compute-0 sudo[70402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywcvbausypulgurjhymzfaaptiadchiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119016.774013-935-248086477277396/AnsiballZ_copy.py'
Jan 22 21:56:57 compute-0 sudo[70402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:57 compute-0 python3.9[70404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119016.774013-935-248086477277396/.source.yaml _original_basename=.tycqhedl follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:58 compute-0 sudo[70402]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:58 compute-0 sudo[70554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgyntkeodcxdkrpewaozbdksysontjok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119018.2140474-980-104830275251616/AnsiballZ_stat.py'
Jan 22 21:56:58 compute-0 sudo[70554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:58 compute-0 python3.9[70556]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:56:58 compute-0 sudo[70554]: pam_unix(sudo:session): session closed for user root
Jan 22 21:56:59 compute-0 sudo[70677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfltbpdcyjligxenyydgrklnjnakczfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119018.2140474-980-104830275251616/AnsiballZ_copy.py'
Jan 22 21:56:59 compute-0 sudo[70677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:56:59 compute-0 python3.9[70679]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119018.2140474-980-104830275251616/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:56:59 compute-0 sudo[70677]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:00 compute-0 sudo[70829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yipkplyvnhvhyihdifbnmmhrzrdvubzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119019.841341-1025-135002062321533/AnsiballZ_command.py'
Jan 22 21:57:00 compute-0 sudo[70829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:00 compute-0 python3.9[70831]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:00 compute-0 sudo[70829]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:01 compute-0 sudo[70982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdsusslkkaxxlkhxryjtdetjntnxwojn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119020.876359-1049-155886463501647/AnsiballZ_command.py'
Jan 22 21:57:01 compute-0 sudo[70982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:01 compute-0 python3.9[70984]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:01 compute-0 sudo[70982]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:02 compute-0 sudo[71135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysbhhgaojhcmfojwwsrjdpggenqyoiha ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119021.7297392-1073-185104273833655/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 21:57:02 compute-0 sudo[71135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:02 compute-0 python3[71137]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 21:57:02 compute-0 sudo[71135]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:03 compute-0 sudo[71287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkkkkobccyyhdzkshnknebmhvnzixcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119022.7376375-1097-67482641233355/AnsiballZ_stat.py'
Jan 22 21:57:03 compute-0 sudo[71287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:03 compute-0 python3.9[71289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:57:03 compute-0 sudo[71287]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:03 compute-0 sudo[71410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyopwkyazeqvwvarzbgytjyybpzvrcpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119022.7376375-1097-67482641233355/AnsiballZ_copy.py'
Jan 22 21:57:03 compute-0 sudo[71410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:04 compute-0 python3.9[71412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119022.7376375-1097-67482641233355/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:04 compute-0 sudo[71410]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:05 compute-0 sudo[71562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlfumgyanofojsosbuzgpervwyijnkdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119024.4290307-1142-150581739574416/AnsiballZ_stat.py'
Jan 22 21:57:05 compute-0 sudo[71562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:05 compute-0 python3.9[71564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:57:05 compute-0 sudo[71562]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:06 compute-0 sudo[71685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnuwpenxuehlkceiwxntznpbabslexle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119024.4290307-1142-150581739574416/AnsiballZ_copy.py'
Jan 22 21:57:06 compute-0 sudo[71685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:06 compute-0 python3.9[71687]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119024.4290307-1142-150581739574416/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:06 compute-0 sudo[71685]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:06 compute-0 sudo[71837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knmzawmxiewgciqozlqltwngarqynyby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119026.5338326-1187-84073082437339/AnsiballZ_stat.py'
Jan 22 21:57:06 compute-0 sudo[71837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:07 compute-0 python3.9[71839]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:57:07 compute-0 sudo[71837]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:07 compute-0 sudo[71960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxaielirzupwcwimxtzdjwpszhjnhxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119026.5338326-1187-84073082437339/AnsiballZ_copy.py'
Jan 22 21:57:07 compute-0 sudo[71960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:07 compute-0 python3.9[71962]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119026.5338326-1187-84073082437339/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:07 compute-0 sudo[71960]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:08 compute-0 sudo[72112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzpbrbjoyqkptripotukrtxjpbfyeqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119028.0614285-1232-56279533832998/AnsiballZ_stat.py'
Jan 22 21:57:08 compute-0 sudo[72112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:08 compute-0 python3.9[72114]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:57:08 compute-0 sudo[72112]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:09 compute-0 sudo[72235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfimrowjhimqxtlpoujhvlivuufvcfgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119028.0614285-1232-56279533832998/AnsiballZ_copy.py'
Jan 22 21:57:09 compute-0 sudo[72235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:09 compute-0 python3.9[72237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119028.0614285-1232-56279533832998/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:09 compute-0 sudo[72235]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:10 compute-0 sudo[72387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jciallorlyvomjehypudfceshszsgire ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119029.923359-1277-26191907020625/AnsiballZ_stat.py'
Jan 22 21:57:10 compute-0 sudo[72387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:10 compute-0 python3.9[72389]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:57:10 compute-0 sudo[72387]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:11 compute-0 sudo[72510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhzasqtepiehludzkkmqffwzqrgymkxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119029.923359-1277-26191907020625/AnsiballZ_copy.py'
Jan 22 21:57:11 compute-0 sudo[72510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:11 compute-0 python3.9[72512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119029.923359-1277-26191907020625/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:11 compute-0 sudo[72510]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:12 compute-0 sudo[72662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjorxzqbqaijdjkfocokckjlonwlfsmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119031.7819405-1322-236149866563913/AnsiballZ_file.py'
Jan 22 21:57:12 compute-0 sudo[72662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:12 compute-0 python3.9[72664]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:12 compute-0 sudo[72662]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:13 compute-0 sudo[72814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krwodusvfuzwpvdimiotsetmsumuqhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119032.604485-1346-161153204394347/AnsiballZ_command.py'
Jan 22 21:57:13 compute-0 sudo[72814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:13 compute-0 python3.9[72816]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:13 compute-0 sudo[72814]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:14 compute-0 sudo[72973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olgnrgbrfnzoetrzqwulslyqozorjrtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119033.5729988-1370-274485959563300/AnsiballZ_blockinfile.py'
Jan 22 21:57:14 compute-0 sudo[72973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:14 compute-0 python3.9[72975]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:14 compute-0 sudo[72973]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:15 compute-0 sudo[73126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttcwqrdrdlvbdiupylpgemjwgckbwpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119034.7803938-1397-23310042598434/AnsiballZ_file.py'
Jan 22 21:57:15 compute-0 sudo[73126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:15 compute-0 python3.9[73128]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:15 compute-0 sudo[73126]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:15 compute-0 sudo[73278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqneirxnebbiieddtejxdqxdxgygbgxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119035.6116083-1397-94807209936777/AnsiballZ_file.py'
Jan 22 21:57:15 compute-0 sudo[73278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:16 compute-0 python3.9[73280]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:16 compute-0 sudo[73278]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:16 compute-0 sudo[73430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueqtlwkpewlsdzbrvtpnkiejyzpwmzqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119036.4228914-1442-20825893762054/AnsiballZ_mount.py'
Jan 22 21:57:16 compute-0 sudo[73430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:17 compute-0 python3.9[73432]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 21:57:17 compute-0 sudo[73430]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:17 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 21:57:17 compute-0 sudo[73584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygaibkcerdyadyvucgqdwpyenmbsemow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119037.4677622-1442-64179621228224/AnsiballZ_mount.py'
Jan 22 21:57:17 compute-0 sudo[73584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:18 compute-0 python3.9[73586]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 21:57:18 compute-0 sudo[73584]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:18 compute-0 sshd-session[64424]: Connection closed by 192.168.122.30 port 36982
Jan 22 21:57:18 compute-0 sshd-session[64421]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:57:18 compute-0 systemd-logind[801]: Session 14 logged out. Waiting for processes to exit.
Jan 22 21:57:18 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 22 21:57:18 compute-0 systemd[1]: session-14.scope: Consumed 41.741s CPU time.
Jan 22 21:57:18 compute-0 systemd-logind[801]: Removed session 14.
Jan 22 21:57:24 compute-0 sshd-session[73612]: Accepted publickey for zuul from 192.168.122.30 port 59818 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:57:24 compute-0 systemd-logind[801]: New session 15 of user zuul.
Jan 22 21:57:24 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 22 21:57:24 compute-0 sshd-session[73612]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:57:24 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 21:57:25 compute-0 sudo[73767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjrtzfcigqbxahmouiyjkizcbexbmor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119044.2737641-23-216476620912437/AnsiballZ_tempfile.py'
Jan 22 21:57:25 compute-0 sudo[73767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:25 compute-0 python3.9[73769]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 22 21:57:25 compute-0 sudo[73767]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:26 compute-0 sudo[73919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhwpzbxawubnvinnrglpaovwafgdkqla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119045.6312702-59-212835729376394/AnsiballZ_stat.py'
Jan 22 21:57:26 compute-0 sudo[73919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:26 compute-0 python3.9[73921]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:57:26 compute-0 sudo[73919]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:27 compute-0 sudo[74071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kydgacedgtezcllwzsfuzztjdrjclwue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119046.691414-89-242120003212331/AnsiballZ_setup.py'
Jan 22 21:57:27 compute-0 sudo[74071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:27 compute-0 python3.9[74073]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:57:27 compute-0 sudo[74071]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:28 compute-0 sudo[74223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmihuubureccdzjbyuinzyaphzosqaei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119048.0184946-114-7836272272073/AnsiballZ_blockinfile.py'
Jan 22 21:57:28 compute-0 sudo[74223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:28 compute-0 python3.9[74225]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDsQ8HxD+acUzMW7y+ojQrFoQhnipuOmPD+pN/LYf0PEMFLjVKFPk/irRo0yydMpfEmQS+fgyBwWIpMFiyItLzghXwdJomlM4+mfL5fh7wMd1luI2YkbAn9gP2J+aUOws2xnHMA+KqDI+DlP0vFrjD5oW0pLyYX1GmHUf8iPZp1hTGPIW7hvNbWS08PbO72YvBEyGnlbvM1g5GAHYC34J/SeByxzhXxcCUBlapFnSLYSO5Iz9f9NKuEam0TX9/0fXYuBUNGGGXZCd1f3Y/de7PUYwWq9+Y4lCBYI1htjd2KgGaXTvvoAIPrbepVAGOGUgR74MjS/EPM/yfHTtAXmmqcX56DGdo5lDlCyEYaU6RwPKZ3KoZ6P+f6mh+HdgOtPNJSNPrqx+1MPZxv8YglYhGlsKP4wunP9+YobVw2L1OkQj15Ve+mq4oaTfVK11Whafo8emmZaaVWdSf/vTySrQKlGH4xuARGkl7OX5QTgl6VeucPYGRIMvNeHiXRus6No+8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKSmkRoT0KNQJUoAcRNTJPaE3FQRYD/wAwUPLYSAUsp8
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOQpt25ukwK1zMNKsuSjA5T22/WVXFcayjnSTcFTbBDbYSzIvs7g7A8Uz5saIem3Nj3Z5ICbv3FkmJqNu5uGxSk=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMbqwZEYmgt4Vcgbv8u1Qt0lSNuSQB7H7j+Hl0xxZN4wkQtUIqdNOgGPMkHdeZXa0K65YhBP/+BcoETX6wD2q3UkJoRu5TCGI1SS+fFxru12xsnSzM7EpUqhYXjuj4iWDJwwXoKGvfWE4koZjdTzrpzaqTV35a6nyZ6W002lmFO3uIJpTbEX6LP+4LbKZDPRXHKgaPn+xeLkkQi/0/umRGuyzMWSc5vwOEqEgW2U7OHNlJKz7d/oB2v2A4lqecLCgqePXrngV2s4CGw0YBB6MOMtVJFTyH/DFKn3OV8pvcIUHJ1K+4ehR1J0ekr3OXcMxuqYTvLlF7oryPsV5d+AK7upNSRAGBb1TkrnQwzf3MacMNCLrerixTpO3AaKxgQwA3oDl3ZaVwRkDqga4B77+WIRtnEC9AHyxH4aIn4G8phcNinp0/Dzt2iv7fQ24qdMcBlEmnOMolXjn9P9PBl/dflKFLhViFeZcpm8v0cRoWtNM0oM9ulE/YmMMdQ6p9Qp0=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOiXEreQbFNQXqTZYKwUY4lV5Q0Vn0xENHFGNNlfTBDN
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA88I7/5bkNrgFopiUfaNLL8soo268+yUoqVyiPopOEwtu6gxG5LSpaOBKcxiBaWS9dz2ydzWt+C5uaJi/r5z8s=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDYovLvWwVne9VsvTNCgmjav87jPUq/BRGKpdlwbBf5Ohf7q125s9EPd5z3jIwwmHFUYAtMZtk+wIC2FRYl56hAP7Wcc/xtJ+NZI8TsTuxchJYYfTsj0hgtMoIPz2KWqpFjbD/tGOhsqb14AxKbv3k+hH0wPGHbtB2RACT9owrJILTRRspSJsRRQsJI+KTSJ8rBRpSxkf8A7v/WOja7BcSQ8G8IuxC3RoVuocuw8/kJL/fhOaIpffzMmHR5bJSVxF1dGxgQOsBAddXZYPiQQcO7dzLlX8JJPwYyDZrCg7cGozd6AteSnPm0jphfRKNpKMEijcBHraRq2KIZu4ofBpO4jzGC1PmR60WPU7Zw59GX9Xip+6xLS06IUyIGHvj6oxvEa9NA6VhiZg7r+G1VUFo4auW4OPt+Fm2IVsaK4SwLRhlNyKOODJRENhFfYRMF+ERwyRTI030r8cuRHhYbeVOh279mpmrcU7r4Uo4V0OaBEnV9bot8fgZLWiELNo73538=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJp6xfeQzlhFSExtmr7lG0P1q/tf7XlRWYTeildkjaJT
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOltuaM3uIWGlrjiBwDU+eji0GOtYxIzYKGqfDQ/lMhMoMQkkyf0jLeN+5sZX7/5cWlTGwRhVmmyEOkGXf7OKgI=
                                             create=True mode=0644 path=/tmp/ansible.vhzgjkr0 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:28 compute-0 sudo[74223]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:29 compute-0 sudo[74375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrxjmdehhuwrmjbykjccdcjzjmuzolcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119048.9624925-138-32642006991092/AnsiballZ_command.py'
Jan 22 21:57:29 compute-0 sudo[74375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:29 compute-0 python3.9[74377]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.vhzgjkr0' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:29 compute-0 sudo[74375]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:30 compute-0 sudo[74529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjregxevfxcjhqnxzgngmmondvqxigf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119050.11329-162-105007177379122/AnsiballZ_file.py'
Jan 22 21:57:30 compute-0 sudo[74529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:30 compute-0 python3.9[74531]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.vhzgjkr0 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:30 compute-0 sudo[74529]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:31 compute-0 sshd-session[73615]: Connection closed by 192.168.122.30 port 59818
Jan 22 21:57:31 compute-0 sshd-session[73612]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:57:31 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 22 21:57:31 compute-0 systemd[1]: session-15.scope: Consumed 4.578s CPU time.
Jan 22 21:57:31 compute-0 systemd-logind[801]: Session 15 logged out. Waiting for processes to exit.
Jan 22 21:57:31 compute-0 systemd-logind[801]: Removed session 15.
Jan 22 21:57:36 compute-0 sshd-session[74556]: Accepted publickey for zuul from 192.168.122.30 port 56518 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:57:36 compute-0 systemd-logind[801]: New session 16 of user zuul.
Jan 22 21:57:36 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 22 21:57:36 compute-0 sshd-session[74556]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:57:37 compute-0 python3.9[74709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:57:39 compute-0 sudo[74863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjvgsnsloukugblavatsewyhxypiglib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119058.1443832-56-221845084040509/AnsiballZ_systemd.py'
Jan 22 21:57:39 compute-0 sudo[74863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:39 compute-0 python3.9[74865]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 21:57:39 compute-0 sudo[74863]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:40 compute-0 sudo[75017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqtjvzpgobtcwmbjjnypkuwccmcfcxgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119059.769713-80-223297493478876/AnsiballZ_systemd.py'
Jan 22 21:57:40 compute-0 sudo[75017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:40 compute-0 python3.9[75019]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 21:57:40 compute-0 sudo[75017]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:41 compute-0 sudo[75170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbijijuqztdnjasbbfofftxnpppagafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119060.9621718-107-105913354421513/AnsiballZ_command.py'
Jan 22 21:57:41 compute-0 sudo[75170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:41 compute-0 python3.9[75172]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:41 compute-0 sudo[75170]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:42 compute-0 sudo[75323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsbvncdqxijptajtewqrmxbnvaldatof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119061.9490666-131-170259431162525/AnsiballZ_stat.py'
Jan 22 21:57:42 compute-0 sudo[75323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:42 compute-0 python3.9[75325]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:57:42 compute-0 sudo[75323]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:43 compute-0 sudo[75477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okyqfvrbdpsseiprpvivxqaegddkybbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119062.9232965-155-261296236837761/AnsiballZ_command.py'
Jan 22 21:57:43 compute-0 sudo[75477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:43 compute-0 python3.9[75479]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:43 compute-0 sudo[75477]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:44 compute-0 sudo[75632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogesjncnnhpoetfzdwtxqdhnxwmwjdab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119063.74915-179-241353238193843/AnsiballZ_file.py'
Jan 22 21:57:44 compute-0 sudo[75632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:44 compute-0 python3.9[75634]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:57:44 compute-0 sudo[75632]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:44 compute-0 sshd-session[74559]: Connection closed by 192.168.122.30 port 56518
Jan 22 21:57:44 compute-0 sshd-session[74556]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:57:44 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 22 21:57:44 compute-0 systemd[1]: session-16.scope: Consumed 5.580s CPU time.
Jan 22 21:57:44 compute-0 systemd-logind[801]: Session 16 logged out. Waiting for processes to exit.
Jan 22 21:57:44 compute-0 systemd-logind[801]: Removed session 16.
Jan 22 21:57:50 compute-0 sshd-session[75659]: Accepted publickey for zuul from 192.168.122.30 port 54976 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:57:50 compute-0 systemd-logind[801]: New session 17 of user zuul.
Jan 22 21:57:50 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 22 21:57:50 compute-0 sshd-session[75659]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:57:51 compute-0 python3.9[75812]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:57:52 compute-0 sudo[75966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqstuhgwrxxkmrmijqyrhkxfewpwlmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119072.0622923-62-136736002937639/AnsiballZ_setup.py'
Jan 22 21:57:52 compute-0 sudo[75966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:52 compute-0 python3.9[75968]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:57:52 compute-0 sudo[75966]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:53 compute-0 sudo[76050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmtifylhwwupfprthunqkrizcrgtdzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119072.0622923-62-136736002937639/AnsiballZ_dnf.py'
Jan 22 21:57:53 compute-0 sudo[76050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:57:53 compute-0 python3.9[76052]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 21:57:54 compute-0 sudo[76050]: pam_unix(sudo:session): session closed for user root
Jan 22 21:57:55 compute-0 python3.9[76203]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:57:57 compute-0 python3.9[76354]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 21:57:58 compute-0 python3.9[76504]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:57:58 compute-0 python3.9[76654]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:57:59 compute-0 sshd-session[75662]: Connection closed by 192.168.122.30 port 54976
Jan 22 21:57:59 compute-0 sshd-session[75659]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:57:59 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 22 21:57:59 compute-0 systemd-logind[801]: Session 17 logged out. Waiting for processes to exit.
Jan 22 21:57:59 compute-0 systemd[1]: session-17.scope: Consumed 6.661s CPU time.
Jan 22 21:57:59 compute-0 systemd-logind[801]: Removed session 17.
Jan 22 21:58:04 compute-0 sshd-session[76679]: Accepted publickey for zuul from 192.168.122.30 port 55040 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:58:04 compute-0 systemd-logind[801]: New session 18 of user zuul.
Jan 22 21:58:04 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 22 21:58:04 compute-0 sshd-session[76679]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:58:05 compute-0 python3.9[76832]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:58:07 compute-0 sudo[76986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ambdzrvjygarcwjcielhhyfbzbroudxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119086.5860012-111-200223996621219/AnsiballZ_file.py'
Jan 22 21:58:07 compute-0 sudo[76986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:07 compute-0 python3.9[76988]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:07 compute-0 sudo[76986]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:07 compute-0 sudo[77139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgubhtgqfwhohgeoqlvjccdzevkrxqog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119087.5058112-111-63654825316905/AnsiballZ_file.py'
Jan 22 21:58:07 compute-0 sudo[77139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:08 compute-0 python3.9[77141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:08 compute-0 sudo[77139]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:08 compute-0 sudo[77291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snlypezvhokloxrmhvedoizrxaexqcxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119088.364683-160-258709375012033/AnsiballZ_stat.py'
Jan 22 21:58:08 compute-0 sudo[77291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:08 compute-0 python3.9[77293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:08 compute-0 sudo[77291]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:09 compute-0 sudo[77414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmfbuntsrzxcvnehvddrfswxffbhcqve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119088.364683-160-258709375012033/AnsiballZ_copy.py'
Jan 22 21:58:09 compute-0 sudo[77414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:09 compute-0 python3.9[77416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119088.364683-160-258709375012033/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d627f02796949c4c41a2f176914e518bc6e3b6af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:09 compute-0 sudo[77414]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:10 compute-0 sudo[77566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhfbqmaxgczojyodaocfnfcooyarhvoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119089.7404215-160-198015709727867/AnsiballZ_stat.py'
Jan 22 21:58:10 compute-0 sudo[77566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:10 compute-0 python3.9[77568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:10 compute-0 sudo[77566]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:10 compute-0 sudo[77689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owcrlwlifbwuvrfrnjjvrjdhrgepbipa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119089.7404215-160-198015709727867/AnsiballZ_copy.py'
Jan 22 21:58:10 compute-0 sudo[77689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:10 compute-0 python3.9[77691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119089.7404215-160-198015709727867/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=db65c959f56a6e9a98b3ef4e3f4f055ca1563e1d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:11 compute-0 sudo[77689]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:11 compute-0 sudo[77841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqqdhmlskssbchbcxcwifudpnpfloaxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119091.202836-160-266204437909559/AnsiballZ_stat.py'
Jan 22 21:58:11 compute-0 sudo[77841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:11 compute-0 python3.9[77843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:11 compute-0 sudo[77841]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:12 compute-0 sudo[77964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkkobpnwdlmqwgrmdxpkcxibwuwakbop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119091.202836-160-266204437909559/AnsiballZ_copy.py'
Jan 22 21:58:12 compute-0 sudo[77964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:12 compute-0 python3.9[77966]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119091.202836-160-266204437909559/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=67c9f5687ff225fc5aec456c3a427a298b24f625 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:12 compute-0 sudo[77964]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:13 compute-0 sudo[78116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psnxotdjpogkmleqcyroncacrfqiuooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119092.6704235-302-276718843716450/AnsiballZ_file.py'
Jan 22 21:58:13 compute-0 sudo[78116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:13 compute-0 python3.9[78118]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:13 compute-0 sudo[78116]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:13 compute-0 sudo[78268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pizwadkwtcixtnqtxuliyrnytxidxiwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119093.4654002-302-165398698559692/AnsiballZ_file.py'
Jan 22 21:58:13 compute-0 sudo[78268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:14 compute-0 python3.9[78270]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:14 compute-0 sudo[78268]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:14 compute-0 sudo[78420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgtedjppwegytqtesjtwzsimiumcexeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119094.2873409-353-110880488821545/AnsiballZ_stat.py'
Jan 22 21:58:14 compute-0 sudo[78420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:14 compute-0 python3.9[78422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:14 compute-0 sudo[78420]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:15 compute-0 sudo[78543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uubwejeosyzkmljnmbnitrshncehybkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119094.2873409-353-110880488821545/AnsiballZ_copy.py'
Jan 22 21:58:15 compute-0 sudo[78543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:15 compute-0 python3.9[78545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119094.2873409-353-110880488821545/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=37e07543d8f80669abff0920fe3fe3c37bb69e86 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:15 compute-0 sudo[78543]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:16 compute-0 sudo[78695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdrfgrtlegptepclquzwwcjnflfuuelh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119095.6856253-353-281188214527545/AnsiballZ_stat.py'
Jan 22 21:58:16 compute-0 sudo[78695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:16 compute-0 python3.9[78697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:16 compute-0 sudo[78695]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:16 compute-0 sudo[78818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgvuzklpnwpbmkolgoynnebocaoknkru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119095.6856253-353-281188214527545/AnsiballZ_copy.py'
Jan 22 21:58:16 compute-0 sudo[78818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:16 compute-0 python3.9[78820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119095.6856253-353-281188214527545/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8e99c559f7307e2be5911618292e726c1ea8db3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:16 compute-0 sudo[78818]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:17 compute-0 sudo[78970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkzkabenopdgyrqtxbyuoezkvefjnyon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119097.1090288-353-136321073313118/AnsiballZ_stat.py'
Jan 22 21:58:17 compute-0 sudo[78970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:17 compute-0 python3.9[78972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:17 compute-0 sudo[78970]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:18 compute-0 sudo[79093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qixzddyhfccnzcxsamcbnudslpzfkkwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119097.1090288-353-136321073313118/AnsiballZ_copy.py'
Jan 22 21:58:18 compute-0 sudo[79093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:18 compute-0 python3.9[79095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119097.1090288-353-136321073313118/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=552a4c317f45ef6755aa85c25261148108b547fc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:18 compute-0 sudo[79093]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:18 compute-0 sudo[79245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvvwfpsrzqnkdcmodtbcxximaucgxour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119098.5366023-493-169495036695710/AnsiballZ_file.py'
Jan 22 21:58:18 compute-0 sudo[79245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:19 compute-0 python3.9[79247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:19 compute-0 sudo[79245]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:19 compute-0 sudo[79397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwookbepbklqmgesvlvdrhmbtuxtguls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119099.3126473-493-15842053212725/AnsiballZ_file.py'
Jan 22 21:58:19 compute-0 sudo[79397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:19 compute-0 python3.9[79399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:19 compute-0 sudo[79397]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:20 compute-0 sudo[79549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbwxelgrendwffjsxkbpeewgzdckgoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119100.1043167-545-54569779470540/AnsiballZ_stat.py'
Jan 22 21:58:20 compute-0 sudo[79549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:20 compute-0 python3.9[79551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:20 compute-0 sudo[79549]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:21 compute-0 sudo[79672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpeeucxpaqxwxgrvtyduvxevuswekzjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119100.1043167-545-54569779470540/AnsiballZ_copy.py'
Jan 22 21:58:21 compute-0 sudo[79672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:21 compute-0 python3.9[79674]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119100.1043167-545-54569779470540/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e88b63313168db0b612cb7274da8a5c4f74eb38b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:21 compute-0 sudo[79672]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:21 compute-0 chronyd[64395]: Selected source 149.56.19.163 (pool.ntp.org)
Jan 22 21:58:21 compute-0 sudo[79824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fosjwrskdjuqgvcuxorpdirhaixjcewd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119101.552286-545-121688334145932/AnsiballZ_stat.py'
Jan 22 21:58:21 compute-0 sudo[79824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:22 compute-0 python3.9[79826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:22 compute-0 sudo[79824]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:22 compute-0 sudo[79947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cslgapohebvqlbcmyvjwrvjnozfazlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119101.552286-545-121688334145932/AnsiballZ_copy.py'
Jan 22 21:58:22 compute-0 sudo[79947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:22 compute-0 python3.9[79949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119101.552286-545-121688334145932/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c08e8d5da31f6b1cbcb93a7e2f1ee2223d2c1be3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:22 compute-0 sudo[79947]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:23 compute-0 sudo[80099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzyhfjciytnioqgvoczpmrsdztiaird ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119102.9600704-545-147353465678509/AnsiballZ_stat.py'
Jan 22 21:58:23 compute-0 sudo[80099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:23 compute-0 python3.9[80101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:23 compute-0 sudo[80099]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:23 compute-0 sudo[80222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifilsitefqhgsicvwddeqqvemquxfycg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119102.9600704-545-147353465678509/AnsiballZ_copy.py'
Jan 22 21:58:23 compute-0 sudo[80222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:24 compute-0 python3.9[80224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119102.9600704-545-147353465678509/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8b9c4d9748bcb155c70df47dd5322f2fa67f8e15 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:24 compute-0 sudo[80222]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:24 compute-0 sudo[80374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vakrwkghwrvzluogpyvwdnbahzhrijdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119104.4986684-691-244779371252310/AnsiballZ_file.py'
Jan 22 21:58:24 compute-0 sudo[80374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:25 compute-0 python3.9[80376]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:25 compute-0 sudo[80374]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:25 compute-0 sudo[80526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkobineheiigzbazgroxhuijkvpkryln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119105.2701862-691-115027189372003/AnsiballZ_file.py'
Jan 22 21:58:25 compute-0 sudo[80526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:25 compute-0 python3.9[80528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:25 compute-0 sudo[80526]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:26 compute-0 sudo[80678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmoeounbfjolxjenamrcntqrmhcmwzjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119106.0486243-742-230692183630675/AnsiballZ_stat.py'
Jan 22 21:58:26 compute-0 sudo[80678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:26 compute-0 python3.9[80680]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:26 compute-0 sudo[80678]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:27 compute-0 sudo[80801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yupjxwtovunultezrhsokmxpjtgxobjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119106.0486243-742-230692183630675/AnsiballZ_copy.py'
Jan 22 21:58:27 compute-0 sudo[80801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:27 compute-0 python3.9[80803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119106.0486243-742-230692183630675/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=11252348b52058b6969f429dcee0a20f6d7e1cda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:27 compute-0 sudo[80801]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:27 compute-0 sudo[80953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czmuoqdnkphnvayvopfkzwbcmmesksjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119107.574376-742-126220219368868/AnsiballZ_stat.py'
Jan 22 21:58:27 compute-0 sudo[80953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:28 compute-0 python3.9[80955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:28 compute-0 sudo[80953]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:28 compute-0 sudo[81076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgtvdoftztgigjixfpmiczycfjicmjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119107.574376-742-126220219368868/AnsiballZ_copy.py'
Jan 22 21:58:28 compute-0 sudo[81076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:28 compute-0 python3.9[81078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119107.574376-742-126220219368868/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c08e8d5da31f6b1cbcb93a7e2f1ee2223d2c1be3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:28 compute-0 sudo[81076]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:29 compute-0 sudo[81228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuohlfflnntxpjencspzzmlwpnvdzedo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119108.9550166-742-20320479691059/AnsiballZ_stat.py'
Jan 22 21:58:29 compute-0 sudo[81228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:29 compute-0 python3.9[81230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:29 compute-0 sudo[81228]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:29 compute-0 sudo[81351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbzjtequfdemmdwrkgoprslxxpiqsdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119108.9550166-742-20320479691059/AnsiballZ_copy.py'
Jan 22 21:58:29 compute-0 sudo[81351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:30 compute-0 python3.9[81353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119108.9550166-742-20320479691059/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1ab8430a1e069c0ba8d0eb45250d3591350e8557 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:30 compute-0 sudo[81351]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:31 compute-0 sudo[81503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikydupmaxrnacouvkiyyinztwtzifypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119110.9887044-933-142528525403341/AnsiballZ_file.py'
Jan 22 21:58:31 compute-0 sudo[81503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:31 compute-0 python3.9[81505]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:31 compute-0 sudo[81503]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:32 compute-0 sudo[81655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkhkjokjmemeqsndmuqodcirdhadyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119111.7741406-959-135622160087478/AnsiballZ_stat.py'
Jan 22 21:58:32 compute-0 sudo[81655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:32 compute-0 python3.9[81657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:32 compute-0 sudo[81655]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:32 compute-0 sudo[81778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stdvykqupeufebqlrgmumdpfvfdqvkmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119111.7741406-959-135622160087478/AnsiballZ_copy.py'
Jan 22 21:58:32 compute-0 sudo[81778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:33 compute-0 python3.9[81780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119111.7741406-959-135622160087478/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:33 compute-0 sudo[81778]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:33 compute-0 sudo[81930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkbkhpybcsbjxzejcagrfhawkmpwyttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119113.2692502-1010-83333693039541/AnsiballZ_file.py'
Jan 22 21:58:33 compute-0 sudo[81930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:33 compute-0 python3.9[81932]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:33 compute-0 sudo[81930]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:34 compute-0 sudo[82082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idobizomaneogdimefwqvfbekqcrjbku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119114.079966-1039-78890268037795/AnsiballZ_stat.py'
Jan 22 21:58:34 compute-0 sudo[82082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:34 compute-0 python3.9[82084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:34 compute-0 sudo[82082]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:35 compute-0 sudo[82205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcziturralmoivlkiffepoalcivjnthq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119114.079966-1039-78890268037795/AnsiballZ_copy.py'
Jan 22 21:58:35 compute-0 sudo[82205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:35 compute-0 python3.9[82207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119114.079966-1039-78890268037795/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:35 compute-0 sudo[82205]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:35 compute-0 sudo[82357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csrjcfqerykxhmqoylulimhcwqlchoao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119115.5366957-1087-117232113859519/AnsiballZ_file.py'
Jan 22 21:58:35 compute-0 sudo[82357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:36 compute-0 python3.9[82359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:36 compute-0 sudo[82357]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:36 compute-0 sudo[82509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tooyqhrofxqnfavksyyvgpikmifklans ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119116.3792787-1115-70650379467583/AnsiballZ_stat.py'
Jan 22 21:58:36 compute-0 sudo[82509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:36 compute-0 python3.9[82511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:36 compute-0 sudo[82509]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:37 compute-0 sudo[82632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqvhkhrpvpgjgtrqhcwnhownxtiqqeuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119116.3792787-1115-70650379467583/AnsiballZ_copy.py'
Jan 22 21:58:37 compute-0 sudo[82632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:37 compute-0 python3.9[82634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119116.3792787-1115-70650379467583/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:37 compute-0 sudo[82632]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:38 compute-0 sudo[82784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eytawzocdazwmwfnerwlvplgtzstxqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119117.8615983-1167-15862726884958/AnsiballZ_file.py'
Jan 22 21:58:38 compute-0 sudo[82784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:38 compute-0 python3.9[82786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:38 compute-0 sudo[82784]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:39 compute-0 sudo[82937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbnuwrwowvgtbxisecqulxburvkwkhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119119.295303-1191-120331032282576/AnsiballZ_stat.py'
Jan 22 21:58:39 compute-0 sudo[82937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:39 compute-0 python3.9[82939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:39 compute-0 sudo[82937]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:40 compute-0 sudo[83060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrpmshlmpeqeqfrhyfvgzlqurkrxmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119119.295303-1191-120331032282576/AnsiballZ_copy.py'
Jan 22 21:58:40 compute-0 sudo[83060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:40 compute-0 python3.9[83062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119119.295303-1191-120331032282576/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:40 compute-0 sudo[83060]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:41 compute-0 sudo[83212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhsiadqhuahywrdaqmlbrpblyvjphkum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119120.7962227-1256-94658672929283/AnsiballZ_file.py'
Jan 22 21:58:41 compute-0 sudo[83212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:41 compute-0 python3.9[83214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:41 compute-0 sudo[83212]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:41 compute-0 sudo[83364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agnnjqbjgmbrxkjbuwhedjwdciecyulr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119121.5864432-1272-72803043654299/AnsiballZ_stat.py'
Jan 22 21:58:41 compute-0 sudo[83364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:42 compute-0 python3.9[83366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:42 compute-0 sudo[83364]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:42 compute-0 sudo[83487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmlfdyvrdiwwmsxvujbkrbiwjxycablt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119121.5864432-1272-72803043654299/AnsiballZ_copy.py'
Jan 22 21:58:42 compute-0 sudo[83487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:42 compute-0 python3.9[83489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119121.5864432-1272-72803043654299/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:42 compute-0 sudo[83487]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:43 compute-0 sudo[83639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xncdzcbotgfqnexvnpcavilaeitrvurp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119123.1070845-1307-52107487092441/AnsiballZ_file.py'
Jan 22 21:58:43 compute-0 sudo[83639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:43 compute-0 python3.9[83641]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:43 compute-0 sudo[83639]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:44 compute-0 sudo[83791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnuhaikeuktfeyzuqcmaryweymiadci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119123.8572168-1323-243121573769703/AnsiballZ_stat.py'
Jan 22 21:58:44 compute-0 sudo[83791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:44 compute-0 python3.9[83793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:44 compute-0 sudo[83791]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:44 compute-0 sudo[83914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgtrnhianjmdxsthvhwpabqpfafefbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119123.8572168-1323-243121573769703/AnsiballZ_copy.py'
Jan 22 21:58:44 compute-0 sudo[83914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:44 compute-0 python3.9[83916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119123.8572168-1323-243121573769703/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:45 compute-0 sudo[83914]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:45 compute-0 sudo[84066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnxpynkheiolfdncnlykmrjbdnyidyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119125.21472-1344-172655674200327/AnsiballZ_file.py'
Jan 22 21:58:45 compute-0 sudo[84066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:45 compute-0 python3.9[84068]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:45 compute-0 sudo[84066]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:46 compute-0 sudo[84218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrzuiyfurczfhycjjsamfpjngqqweiii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119125.953704-1352-2900249265215/AnsiballZ_stat.py'
Jan 22 21:58:46 compute-0 sudo[84218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:46 compute-0 python3.9[84220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:58:46 compute-0 sudo[84218]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:47 compute-0 sudo[84341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkbrdgtkbyfevkllccqwbyrfckswlbca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119125.953704-1352-2900249265215/AnsiballZ_copy.py'
Jan 22 21:58:47 compute-0 sudo[84341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:47 compute-0 python3.9[84343]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119125.953704-1352-2900249265215/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:58:47 compute-0 sudo[84341]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:47 compute-0 sshd-session[76682]: Connection closed by 192.168.122.30 port 55040
Jan 22 21:58:47 compute-0 sshd-session[76679]: pam_unix(sshd:session): session closed for user zuul
Jan 22 21:58:47 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 22 21:58:47 compute-0 systemd[1]: session-18.scope: Consumed 35.146s CPU time.
Jan 22 21:58:47 compute-0 systemd-logind[801]: Session 18 logged out. Waiting for processes to exit.
Jan 22 21:58:47 compute-0 systemd-logind[801]: Removed session 18.
Jan 22 21:58:52 compute-0 sshd-session[84368]: Accepted publickey for zuul from 192.168.122.30 port 37274 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 21:58:52 compute-0 systemd-logind[801]: New session 19 of user zuul.
Jan 22 21:58:52 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 22 21:58:52 compute-0 sshd-session[84368]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 21:58:54 compute-0 python3.9[84521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:58:55 compute-0 sudo[84675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixnfwgkdewxoliiirtiauhbdrdtshiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119134.6637573-62-35330315730533/AnsiballZ_file.py'
Jan 22 21:58:55 compute-0 sudo[84675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:55 compute-0 python3.9[84677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:55 compute-0 sudo[84675]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:55 compute-0 sudo[84827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrgmzpzjqfjhqvhjmloovihvypruevy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119135.5139883-62-29083596302483/AnsiballZ_file.py'
Jan 22 21:58:55 compute-0 sudo[84827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:56 compute-0 python3.9[84829]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:58:56 compute-0 sudo[84827]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:56 compute-0 python3.9[84979]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:58:57 compute-0 sudo[85129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtakiaubxuiquhrbqlahyeulhlaxyrld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119137.1780527-131-87459292293971/AnsiballZ_seboolean.py'
Jan 22 21:58:57 compute-0 sudo[85129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:58:57 compute-0 python3.9[85131]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 21:58:59 compute-0 sudo[85129]: pam_unix(sudo:session): session closed for user root
Jan 22 21:58:59 compute-0 sudo[85285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mklficbwjyntdqpqydfstosuldrohpwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119139.500009-161-37165365543451/AnsiballZ_setup.py'
Jan 22 21:58:59 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 22 21:58:59 compute-0 sudo[85285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:00 compute-0 python3.9[85287]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 21:59:00 compute-0 sudo[85285]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:00 compute-0 sudo[85369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxjrfpxhodhnzjzgvnrhfkkezvkgoxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119139.500009-161-37165365543451/AnsiballZ_dnf.py'
Jan 22 21:59:00 compute-0 sudo[85369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:01 compute-0 python3.9[85371]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 21:59:02 compute-0 sudo[85369]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:03 compute-0 sudo[85522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlblptymnhcekjqzxuucvlpfmmqrsxyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119142.5771363-197-180508637315565/AnsiballZ_systemd.py'
Jan 22 21:59:03 compute-0 sudo[85522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:03 compute-0 python3.9[85524]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 21:59:03 compute-0 sudo[85522]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:04 compute-0 sudo[85677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slbsebwdpemzyfuciymnkkewckroyvqx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119143.880422-221-17268688948707/AnsiballZ_edpm_nftables_snippet.py'
Jan 22 21:59:04 compute-0 sudo[85677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:04 compute-0 python3[85679]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 22 21:59:04 compute-0 sudo[85677]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:05 compute-0 sudo[85829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luckwpbghiohfinfknmpslzsithnqcog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119144.891975-248-135262742163334/AnsiballZ_file.py'
Jan 22 21:59:05 compute-0 sudo[85829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:05 compute-0 python3.9[85831]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:05 compute-0 sudo[85829]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:06 compute-0 sudo[85981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtlngscjevruliciukrebaxyjxrbqzfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119145.634376-272-245966871438662/AnsiballZ_stat.py'
Jan 22 21:59:06 compute-0 sudo[85981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:06 compute-0 python3.9[85983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:06 compute-0 sudo[85981]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:06 compute-0 sudo[86059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eykicjsegjdyxdlkgavtetlyqtsgpiwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119145.634376-272-245966871438662/AnsiballZ_file.py'
Jan 22 21:59:06 compute-0 sudo[86059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:07 compute-0 python3.9[86061]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:07 compute-0 sudo[86059]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:07 compute-0 sudo[86211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mondlxaoviunfngqxjexsnwfwckoppmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119147.3123312-308-262008507992424/AnsiballZ_stat.py'
Jan 22 21:59:07 compute-0 sudo[86211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:07 compute-0 python3.9[86213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:07 compute-0 sudo[86211]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:08 compute-0 sudo[86289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tolemfmrtzsxbqmlozzgpkpczocgkein ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119147.3123312-308-262008507992424/AnsiballZ_file.py'
Jan 22 21:59:08 compute-0 sudo[86289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:08 compute-0 python3.9[86291]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.l3qv69a4 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:08 compute-0 sudo[86289]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:09 compute-0 sudo[86441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwesdpnyouwbunevxcywhluhekrgzftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119148.744461-344-68378643908333/AnsiballZ_stat.py'
Jan 22 21:59:09 compute-0 sudo[86441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:09 compute-0 python3.9[86443]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:09 compute-0 sudo[86441]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:09 compute-0 sudo[86519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydpqtosfgpgohesbyqarjvvpclkbmxpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119148.744461-344-68378643908333/AnsiballZ_file.py'
Jan 22 21:59:09 compute-0 sudo[86519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:09 compute-0 python3.9[86521]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:09 compute-0 sudo[86519]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:10 compute-0 sudo[86671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqeskpsqflrolsquqfuvjwgqqpzxvvrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119150.1117485-383-174985414808363/AnsiballZ_command.py'
Jan 22 21:59:10 compute-0 sudo[86671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:10 compute-0 python3.9[86673]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:10 compute-0 sudo[86671]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:11 compute-0 sudo[86824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkiklarknllcxvprexdsvhwkuvpcthjs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119151.0186915-407-113872792507796/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 21:59:11 compute-0 sudo[86824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:11 compute-0 python3[86826]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 21:59:11 compute-0 sudo[86824]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:12 compute-0 sudo[86976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdljwdexisipeghvhmynfgftridxecse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119151.8803961-431-108350539582764/AnsiballZ_stat.py'
Jan 22 21:59:12 compute-0 sudo[86976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:12 compute-0 python3.9[86978]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:12 compute-0 sudo[86976]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:13 compute-0 sudo[87101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdgnhducuwdecpngeelmcumdphovfwpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119151.8803961-431-108350539582764/AnsiballZ_copy.py'
Jan 22 21:59:13 compute-0 sudo[87101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:13 compute-0 python3.9[87103]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119151.8803961-431-108350539582764/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:13 compute-0 sudo[87101]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:13 compute-0 sudo[87253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktdjoyafjainwqywhbdzxrzkzbtdeldm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119153.488888-476-115055710142296/AnsiballZ_stat.py'
Jan 22 21:59:13 compute-0 sudo[87253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:14 compute-0 python3.9[87255]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:14 compute-0 sudo[87253]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:14 compute-0 sudo[87378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgncjqwgkjzftjqklgyhnvqhdujfqfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119153.488888-476-115055710142296/AnsiballZ_copy.py'
Jan 22 21:59:14 compute-0 sudo[87378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:14 compute-0 python3.9[87380]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119153.488888-476-115055710142296/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:14 compute-0 sudo[87378]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:15 compute-0 sudo[87530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaecxxhedellmwgergoxeizfejdzjhim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119154.9910197-521-201139376305542/AnsiballZ_stat.py'
Jan 22 21:59:15 compute-0 sudo[87530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:15 compute-0 python3.9[87532]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:15 compute-0 sudo[87530]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:16 compute-0 sudo[87655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgywmmvmppggqthwtsbdqnqqwpcqcyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119154.9910197-521-201139376305542/AnsiballZ_copy.py'
Jan 22 21:59:16 compute-0 sudo[87655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:16 compute-0 python3.9[87657]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119154.9910197-521-201139376305542/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:16 compute-0 sudo[87655]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:16 compute-0 sudo[87807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnezrnlhkdmnppucpaquhougyndaoucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119156.4867592-566-173642607434952/AnsiballZ_stat.py'
Jan 22 21:59:16 compute-0 sudo[87807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:17 compute-0 python3.9[87809]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:17 compute-0 sudo[87807]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:17 compute-0 sudo[87932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtyynmlyxtnurcgpxhppbequeuzjstbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119156.4867592-566-173642607434952/AnsiballZ_copy.py'
Jan 22 21:59:17 compute-0 sudo[87932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:17 compute-0 python3.9[87934]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119156.4867592-566-173642607434952/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:17 compute-0 sudo[87932]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:18 compute-0 sudo[88084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owhepkhrlxwxhvviotdxnpjrocicnjld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119158.0145402-611-242367906924783/AnsiballZ_stat.py'
Jan 22 21:59:18 compute-0 sudo[88084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:18 compute-0 python3.9[88086]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:18 compute-0 sudo[88084]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:19 compute-0 sudo[88209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-molboyhawyvpxvwawyepcrihdtihrmfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119158.0145402-611-242367906924783/AnsiballZ_copy.py'
Jan 22 21:59:19 compute-0 sudo[88209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:19 compute-0 python3.9[88211]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119158.0145402-611-242367906924783/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:19 compute-0 sudo[88209]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:19 compute-0 sudo[88361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nraovfvwburchblalszmxemiotiniuuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119159.5634139-656-61037964167287/AnsiballZ_file.py'
Jan 22 21:59:19 compute-0 sudo[88361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:20 compute-0 python3.9[88363]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:20 compute-0 sudo[88361]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:20 compute-0 sudo[88513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvcbusqhbwbshpveuwwcgslfcnidwukn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119160.3456142-680-126145540012205/AnsiballZ_command.py'
Jan 22 21:59:20 compute-0 sudo[88513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:20 compute-0 python3.9[88515]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:20 compute-0 sudo[88513]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:21 compute-0 sudo[88668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fywpgnktwlytwjyjhlufchdebkxntotv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119161.1709676-704-13598773257123/AnsiballZ_blockinfile.py'
Jan 22 21:59:21 compute-0 sudo[88668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:21 compute-0 python3.9[88670]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:21 compute-0 sudo[88668]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:22 compute-0 sudo[88820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzlylgxbhlceaojarmmajjdtpidilppz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119162.1814876-731-115801473616682/AnsiballZ_command.py'
Jan 22 21:59:22 compute-0 sudo[88820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:22 compute-0 python3.9[88822]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:22 compute-0 sudo[88820]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:23 compute-0 sudo[88973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwsrpqdnbjkzrkokbkuqjypjrtpsaghx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119162.9916089-755-164816380886709/AnsiballZ_stat.py'
Jan 22 21:59:23 compute-0 sudo[88973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:23 compute-0 python3.9[88975]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:59:23 compute-0 sudo[88973]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:24 compute-0 sudo[89127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etzfqyvqafacgzhbeiruzupwdsqoeihv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119163.757607-779-75296616691054/AnsiballZ_command.py'
Jan 22 21:59:24 compute-0 sudo[89127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:24 compute-0 python3.9[89129]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:24 compute-0 sudo[89127]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:24 compute-0 sudo[89282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkgqhpvorfchulnzulpmthdpgieeswkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119164.5002668-803-6006268684604/AnsiballZ_file.py'
Jan 22 21:59:24 compute-0 sudo[89282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:25 compute-0 python3.9[89284]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:25 compute-0 sudo[89282]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:26 compute-0 python3.9[89434]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 21:59:27 compute-0 sudo[89585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ookhsurodimacwxitfgibcpaphsdhuvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119167.3247774-923-77066823513950/AnsiballZ_command.py'
Jan 22 21:59:27 compute-0 sudo[89585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:27 compute-0 python3.9[89587]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:27 compute-0 ovs-vsctl[89588]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 22 21:59:27 compute-0 sudo[89585]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:28 compute-0 sudo[89738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rddbdowzoemneumnsdzqidbvwhclnwfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119168.1486554-950-121461608688338/AnsiballZ_command.py'
Jan 22 21:59:28 compute-0 sudo[89738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:28 compute-0 python3.9[89740]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:28 compute-0 sudo[89738]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:29 compute-0 sudo[89893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvdnpczstqrqclybmdusifhckirprtux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119168.9267962-974-161952711971802/AnsiballZ_command.py'
Jan 22 21:59:29 compute-0 sudo[89893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:29 compute-0 python3.9[89895]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 21:59:29 compute-0 ovs-vsctl[89896]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 22 21:59:29 compute-0 sudo[89893]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:30 compute-0 python3.9[90046]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:59:30 compute-0 sudo[90198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sutaenwlucoakauulprnkiknwnvjjkhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119170.5164006-1025-139658015882477/AnsiballZ_file.py'
Jan 22 21:59:30 compute-0 sudo[90198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:31 compute-0 python3.9[90200]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:31 compute-0 sudo[90198]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:31 compute-0 sudo[90350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pofxdysftdtztnclbogdzqjdgjnxhmir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119171.3085182-1049-246617349933780/AnsiballZ_stat.py'
Jan 22 21:59:31 compute-0 sudo[90350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:32 compute-0 python3.9[90352]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:32 compute-0 sudo[90350]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:32 compute-0 sudo[90428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifgbmhymixpjoyxckjolkyxpitpaldhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119171.3085182-1049-246617349933780/AnsiballZ_file.py'
Jan 22 21:59:32 compute-0 sudo[90428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:32 compute-0 python3.9[90430]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:32 compute-0 sudo[90428]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:33 compute-0 sudo[90580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkmagttikucwzzzljdoidxwdbpvrylwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119172.7815714-1049-279036653331146/AnsiballZ_stat.py'
Jan 22 21:59:33 compute-0 sudo[90580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:33 compute-0 python3.9[90582]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:33 compute-0 sudo[90580]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:33 compute-0 sudo[90658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vprkdllppdubyasvfuahgcomuaoxrasf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119172.7815714-1049-279036653331146/AnsiballZ_file.py'
Jan 22 21:59:33 compute-0 sudo[90658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:33 compute-0 python3.9[90660]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:33 compute-0 sudo[90658]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:34 compute-0 sudo[90810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvvzusxferkcociylhnzzlibsxcpuvou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119174.1127079-1118-223440644816533/AnsiballZ_file.py'
Jan 22 21:59:34 compute-0 sudo[90810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:34 compute-0 python3.9[90812]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:34 compute-0 sudo[90810]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:35 compute-0 sudo[90962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqmcjobyaibyjlmldmfibocfcizewxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119174.837064-1142-13720649956448/AnsiballZ_stat.py'
Jan 22 21:59:35 compute-0 sudo[90962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:35 compute-0 python3.9[90964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:35 compute-0 sudo[90962]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:35 compute-0 sudo[91040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fikseoqqvvhfxhvgyfoeomexlrvlnaox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119174.837064-1142-13720649956448/AnsiballZ_file.py'
Jan 22 21:59:35 compute-0 sudo[91040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:35 compute-0 python3.9[91042]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:35 compute-0 sudo[91040]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:36 compute-0 sudo[91192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywfvqlkpshydmmhfgopasfhppxieuefq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119176.119144-1178-120734851721858/AnsiballZ_stat.py'
Jan 22 21:59:36 compute-0 sudo[91192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:36 compute-0 python3.9[91194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:36 compute-0 sudo[91192]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:37 compute-0 sudo[91270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrhyjnizihzigxgmfsculmggpsbfdaqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119176.119144-1178-120734851721858/AnsiballZ_file.py'
Jan 22 21:59:37 compute-0 sudo[91270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:37 compute-0 python3.9[91272]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:37 compute-0 sudo[91270]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:37 compute-0 sudo[91422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdqlepfrshiyutgemnptwizlyvygpvyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119177.6016374-1214-180430200591929/AnsiballZ_systemd.py'
Jan 22 21:59:37 compute-0 sudo[91422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:38 compute-0 python3.9[91424]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:59:38 compute-0 systemd[1]: Reloading.
Jan 22 21:59:38 compute-0 systemd-rc-local-generator[91447]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:59:38 compute-0 systemd-sysv-generator[91450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:59:38 compute-0 sudo[91422]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:39 compute-0 sudo[91612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddezsbaplqstzzprefgyluxhcniehjhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119178.827185-1238-187981869668979/AnsiballZ_stat.py'
Jan 22 21:59:39 compute-0 sudo[91612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:39 compute-0 python3.9[91614]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:39 compute-0 sudo[91612]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:39 compute-0 sudo[91690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzfedbobcmoxfipbdxpdqkgfjjomldmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119178.827185-1238-187981869668979/AnsiballZ_file.py'
Jan 22 21:59:39 compute-0 sudo[91690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:39 compute-0 python3.9[91692]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:39 compute-0 sudo[91690]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:40 compute-0 sudo[91842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwkgneuildmxreabpgicviisixakqwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119180.1714413-1274-165455373282146/AnsiballZ_stat.py'
Jan 22 21:59:40 compute-0 sudo[91842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:40 compute-0 python3.9[91844]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:40 compute-0 sudo[91842]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:41 compute-0 sudo[91920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwqljvmagvihrotsdidthytupdeqinej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119180.1714413-1274-165455373282146/AnsiballZ_file.py'
Jan 22 21:59:41 compute-0 sudo[91920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:41 compute-0 python3.9[91922]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:41 compute-0 sudo[91920]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:42 compute-0 sudo[92072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvsujseighuarjaorwzvkkchbhvkqzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119181.7461312-1310-198215730947505/AnsiballZ_systemd.py'
Jan 22 21:59:42 compute-0 sudo[92072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:42 compute-0 python3.9[92074]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 21:59:42 compute-0 systemd[1]: Reloading.
Jan 22 21:59:42 compute-0 systemd-rc-local-generator[92097]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:59:42 compute-0 systemd-sysv-generator[92101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 21:59:42 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 21:59:42 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 21:59:42 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 21:59:42 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 21:59:42 compute-0 sudo[92072]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:43 compute-0 sudo[92267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixvteskbdhwafmuwinyzfqwvofnvobdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119183.215047-1340-105738229905847/AnsiballZ_file.py'
Jan 22 21:59:43 compute-0 sudo[92267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:43 compute-0 python3.9[92269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:43 compute-0 sudo[92267]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:44 compute-0 sudo[92419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlrtdfgoxsrcgquutqkdriazmhggvzkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119183.947092-1364-41237956183658/AnsiballZ_stat.py'
Jan 22 21:59:44 compute-0 sudo[92419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:44 compute-0 python3.9[92421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:44 compute-0 sudo[92419]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:45 compute-0 sudo[92542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvsfttqnmfmsepoeusilfyufnksildnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119183.947092-1364-41237956183658/AnsiballZ_copy.py'
Jan 22 21:59:45 compute-0 sudo[92542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:45 compute-0 python3.9[92544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119183.947092-1364-41237956183658/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:45 compute-0 sudo[92542]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:46 compute-0 sudo[92694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueoeascccxbietectrmkiajebvclaehj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119185.8157406-1415-271383742361254/AnsiballZ_file.py'
Jan 22 21:59:46 compute-0 sudo[92694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:46 compute-0 python3.9[92696]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:46 compute-0 sudo[92694]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:46 compute-0 sudo[92846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmahxzvtnwcktdnkisthabqlxakzyjhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119186.5934207-1439-133987627205094/AnsiballZ_file.py'
Jan 22 21:59:47 compute-0 sudo[92846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:47 compute-0 python3.9[92848]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 21:59:47 compute-0 sudo[92846]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:47 compute-0 sudo[92998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drinivpfohoezhicxtrvzjiwkozlymwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119187.4847028-1463-82574080529530/AnsiballZ_stat.py'
Jan 22 21:59:47 compute-0 sudo[92998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:48 compute-0 python3.9[93000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 21:59:48 compute-0 sudo[92998]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:48 compute-0 sudo[93121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kojnrqauscuosjxnpryocvutggjoftce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119187.4847028-1463-82574080529530/AnsiballZ_copy.py'
Jan 22 21:59:48 compute-0 sudo[93121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:48 compute-0 python3.9[93123]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119187.4847028-1463-82574080529530/.source.json _original_basename=.otj6k3ep follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:48 compute-0 sudo[93121]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:49 compute-0 python3.9[93273]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:52 compute-0 sudo[93694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikjewmhleeglxrdtkwyquyrmtfrnweko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119192.1298447-1583-25731662046062/AnsiballZ_container_config_data.py'
Jan 22 21:59:52 compute-0 sudo[93694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:53 compute-0 python3.9[93696]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 22 21:59:53 compute-0 sudo[93694]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:54 compute-0 sudo[93846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgukubkvbzwuylocnrmsbgyrjfzyiulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119193.5107625-1616-190699478150855/AnsiballZ_container_config_hash.py'
Jan 22 21:59:54 compute-0 sudo[93846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:54 compute-0 python3.9[93848]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 21:59:54 compute-0 sudo[93846]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:55 compute-0 sudo[93998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvrplmgjjxjglznpcgnmzjzyziddvpld ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119194.6483817-1646-64464392467685/AnsiballZ_edpm_container_manage.py'
Jan 22 21:59:55 compute-0 sudo[93998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:55 compute-0 python3[94000]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 21:59:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:59:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:59:55 compute-0 podman[94036]: 2026-01-22 21:59:55.795837185 +0000 UTC m=+0.063109917 container create c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 21:59:55 compute-0 podman[94036]: 2026-01-22 21:59:55.760075148 +0000 UTC m=+0.027347930 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 21:59:55 compute-0 python3[94000]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 21:59:55 compute-0 sudo[93998]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 21:59:56 compute-0 sudo[94224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjycueiyqfpnbqvehzjtkgcifbonmzpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119196.3406546-1670-232210608122177/AnsiballZ_stat.py'
Jan 22 21:59:56 compute-0 sudo[94224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:56 compute-0 python3.9[94226]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:59:56 compute-0 sudo[94224]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:57 compute-0 sudo[94378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-outvfqxgrmkewofvxlzvubjaafakvanq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119197.344082-1697-262044259653964/AnsiballZ_file.py'
Jan 22 21:59:57 compute-0 sudo[94378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:57 compute-0 python3.9[94380]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:57 compute-0 sudo[94378]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:58 compute-0 sudo[94454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etboqdcjxgsxkutbmbacknznahapcgpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119197.344082-1697-262044259653964/AnsiballZ_stat.py'
Jan 22 21:59:58 compute-0 sudo[94454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:58 compute-0 python3.9[94456]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 21:59:58 compute-0 sudo[94454]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:58 compute-0 sudo[94605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoopidmgwgcvpwiwxesivaningoygqbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119198.4737947-1697-111902623271594/AnsiballZ_copy.py'
Jan 22 21:59:58 compute-0 sudo[94605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:59 compute-0 python3.9[94607]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119198.4737947-1697-111902623271594/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 21:59:59 compute-0 sudo[94605]: pam_unix(sudo:session): session closed for user root
Jan 22 21:59:59 compute-0 sudo[94681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klevynusjbtmrarymcutnayidlwdxynq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119198.4737947-1697-111902623271594/AnsiballZ_systemd.py'
Jan 22 21:59:59 compute-0 sudo[94681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 21:59:59 compute-0 python3.9[94683]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 21:59:59 compute-0 systemd[1]: Reloading.
Jan 22 21:59:59 compute-0 systemd-rc-local-generator[94705]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 21:59:59 compute-0 systemd-sysv-generator[94712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:00:00 compute-0 sudo[94681]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:00 compute-0 sudo[94792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfpgpyeextkxqaltueahybsbaxwjnktm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119198.4737947-1697-111902623271594/AnsiballZ_systemd.py'
Jan 22 22:00:00 compute-0 sudo[94792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:00 compute-0 python3.9[94794]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:00:00 compute-0 systemd[1]: Reloading.
Jan 22 22:00:00 compute-0 systemd-rc-local-generator[94820]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:00:00 compute-0 systemd-sysv-generator[94827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:00:00 compute-0 systemd[1]: Starting ovn_controller container...
Jan 22 22:00:01 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 22 22:00:01 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:00:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f80ce30ad44f1acb34af0a01e8a2661bf650c4f45d1358b20d9fe2d2ce46b944/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 22:00:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0.
Jan 22 22:00:01 compute-0 podman[94835]: 2026-01-22 22:00:01.174156685 +0000 UTC m=+0.164987573 container init c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + sudo -E kolla_set_configs
Jan 22 22:00:01 compute-0 podman[94835]: 2026-01-22 22:00:01.212963468 +0000 UTC m=+0.203794366 container start c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:00:01 compute-0 edpm-start-podman-container[94835]: ovn_controller
Jan 22 22:00:01 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 22 22:00:01 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 22 22:00:01 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 22 22:00:01 compute-0 edpm-start-podman-container[94834]: Creating additional drop-in dependency for "ovn_controller" (c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0)
Jan 22 22:00:01 compute-0 podman[94857]: 2026-01-22 22:00:01.308838056 +0000 UTC m=+0.087637915 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:00:01 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 22 22:00:01 compute-0 systemd[1]: c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0-b71c865cf4eee3e.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:00:01 compute-0 systemd[1]: c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0-b71c865cf4eee3e.service: Failed with result 'exit-code'.
Jan 22 22:00:01 compute-0 systemd[94889]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 22 22:00:01 compute-0 systemd[1]: Reloading.
Jan 22 22:00:01 compute-0 systemd-sysv-generator[94937]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:00:01 compute-0 systemd-rc-local-generator[94934]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:00:01 compute-0 systemd[94889]: Queued start job for default target Main User Target.
Jan 22 22:00:01 compute-0 systemd[94889]: Created slice User Application Slice.
Jan 22 22:00:01 compute-0 systemd[94889]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 22 22:00:01 compute-0 systemd[94889]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:00:01 compute-0 systemd[94889]: Reached target Paths.
Jan 22 22:00:01 compute-0 systemd[94889]: Reached target Timers.
Jan 22 22:00:01 compute-0 systemd[94889]: Starting D-Bus User Message Bus Socket...
Jan 22 22:00:01 compute-0 systemd[94889]: Starting Create User's Volatile Files and Directories...
Jan 22 22:00:01 compute-0 systemd[94889]: Finished Create User's Volatile Files and Directories.
Jan 22 22:00:01 compute-0 systemd[94889]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:00:01 compute-0 systemd[94889]: Reached target Sockets.
Jan 22 22:00:01 compute-0 systemd[94889]: Reached target Basic System.
Jan 22 22:00:01 compute-0 systemd[94889]: Reached target Main User Target.
Jan 22 22:00:01 compute-0 systemd[94889]: Startup finished in 148ms.
Jan 22 22:00:01 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 22 22:00:01 compute-0 systemd[1]: Started ovn_controller container.
Jan 22 22:00:01 compute-0 systemd[1]: Started Session c1 of User root.
Jan 22 22:00:01 compute-0 sudo[94792]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:01 compute-0 ovn_controller[94850]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 22:00:01 compute-0 ovn_controller[94850]: INFO:__main__:Validating config file
Jan 22 22:00:01 compute-0 ovn_controller[94850]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 22:00:01 compute-0 ovn_controller[94850]: INFO:__main__:Writing out command to execute
Jan 22 22:00:01 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: ++ cat /run_command
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + ARGS=
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + sudo kolla_copy_cacerts
Jan 22 22:00:01 compute-0 systemd[1]: Started Session c2 of User root.
Jan 22 22:00:01 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + [[ ! -n '' ]]
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + . kolla_extend_start
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 22 22:00:01 compute-0 ovn_controller[94850]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + umask 0022
Jan 22 22:00:01 compute-0 ovn_controller[94850]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7712] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7724] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <warn>  [1769119201.7728] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7739] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7749] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7755] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 22:00:01 compute-0 kernel: br-int: entered promiscuous mode
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00024|main|INFO|OVS feature set changed, force recompute.
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 22:00:01 compute-0 ovn_controller[94850]: 2026-01-22T22:00:01Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7924] manager: (ovn-bdc194-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7942] manager: (ovn-b36a49-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.7965] manager: (ovn-e130c2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 22 22:00:01 compute-0 systemd-udevd[94981]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:00:01 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 22 22:00:01 compute-0 systemd-udevd[94983]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.8196] device (genev_sys_6081): carrier: link connected
Jan 22 22:00:01 compute-0 NetworkManager[54954]: <info>  [1769119201.8202] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Jan 22 22:00:02 compute-0 python3.9[95111]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:00:03 compute-0 sudo[95261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuousxdufwevbxhfpoauylcswdqvjhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119203.2668262-1832-117755656363070/AnsiballZ_stat.py'
Jan 22 22:00:03 compute-0 sudo[95261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:03 compute-0 python3.9[95263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:03 compute-0 sudo[95261]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:04 compute-0 sudo[95384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsudycmfupcydpqirqhlkhdkutmkenvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119203.2668262-1832-117755656363070/AnsiballZ_copy.py'
Jan 22 22:00:04 compute-0 sudo[95384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:04 compute-0 python3.9[95386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119203.2668262-1832-117755656363070/.source.yaml _original_basename=.l1l92_o7 follow=False checksum=4c948b03318d4125f88ccbb3951023e44c0d629c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:04 compute-0 sudo[95384]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:05 compute-0 sudo[95536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvsgnkaczcrfvukkedchiwugnmfauuzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119204.7214904-1877-269932491562513/AnsiballZ_command.py'
Jan 22 22:00:05 compute-0 sudo[95536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:05 compute-0 python3.9[95538]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:00:05 compute-0 ovs-vsctl[95539]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 22 22:00:05 compute-0 sudo[95536]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:05 compute-0 sudo[95689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqvuwqmntnmwswgoxslwmlmlzljguybb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119205.5187151-1901-57060020050587/AnsiballZ_command.py'
Jan 22 22:00:05 compute-0 sudo[95689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:06 compute-0 python3.9[95691]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:00:06 compute-0 ovs-vsctl[95693]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 22 22:00:06 compute-0 sudo[95689]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:06 compute-0 sudo[95844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zebfpozndbidbilmnzeohcnhwuadhzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119206.6069102-1943-17885886069155/AnsiballZ_command.py'
Jan 22 22:00:06 compute-0 sudo[95844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:07 compute-0 python3.9[95846]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:00:07 compute-0 ovs-vsctl[95847]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 22 22:00:07 compute-0 sudo[95844]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:07 compute-0 sshd-session[84371]: Connection closed by 192.168.122.30 port 37274
Jan 22 22:00:07 compute-0 sshd-session[84368]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:00:07 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 22 22:00:07 compute-0 systemd[1]: session-19.scope: Consumed 56.102s CPU time.
Jan 22 22:00:07 compute-0 systemd-logind[801]: Session 19 logged out. Waiting for processes to exit.
Jan 22 22:00:07 compute-0 systemd-logind[801]: Removed session 19.
Jan 22 22:00:11 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 22 22:00:11 compute-0 systemd[94889]: Activating special unit Exit the Session...
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped target Main User Target.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped target Basic System.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped target Paths.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped target Sockets.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped target Timers.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:00:11 compute-0 systemd[94889]: Closed D-Bus User Message Bus Socket.
Jan 22 22:00:11 compute-0 systemd[94889]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:00:11 compute-0 systemd[94889]: Removed slice User Application Slice.
Jan 22 22:00:11 compute-0 systemd[94889]: Reached target Shutdown.
Jan 22 22:00:11 compute-0 systemd[94889]: Finished Exit the Session.
Jan 22 22:00:11 compute-0 systemd[94889]: Reached target Exit the Session.
Jan 22 22:00:11 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 22 22:00:11 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 22 22:00:11 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 22 22:00:11 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 22 22:00:11 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 22 22:00:11 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 22 22:00:11 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 22 22:00:12 compute-0 sshd-session[95875]: Accepted publickey for zuul from 192.168.122.30 port 37962 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 22:00:12 compute-0 systemd-logind[801]: New session 21 of user zuul.
Jan 22 22:00:12 compute-0 systemd[1]: Started Session 21 of User zuul.
Jan 22 22:00:12 compute-0 sshd-session[95875]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 22:00:13 compute-0 python3.9[96028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:00:14 compute-0 sudo[96182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcnmfxhvjfhziekyfldgcjyooxgtbiwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119214.4559941-62-218119027373374/AnsiballZ_file.py'
Jan 22 22:00:14 compute-0 sudo[96182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:15 compute-0 python3.9[96184]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:15 compute-0 sudo[96182]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:15 compute-0 sudo[96334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehrjdxcckjkabbnoqyiakpfqaaundbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119215.3551629-62-216243191882215/AnsiballZ_file.py'
Jan 22 22:00:15 compute-0 sudo[96334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:15 compute-0 python3.9[96336]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:15 compute-0 sudo[96334]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:16 compute-0 sudo[96486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjjgtxskvuwqkrmvascncpcqghtfoxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119216.1662636-62-249286470115524/AnsiballZ_file.py'
Jan 22 22:00:16 compute-0 sudo[96486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:16 compute-0 python3.9[96488]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:16 compute-0 sudo[96486]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:17 compute-0 sudo[96638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvkfgyrszquhbbhvdmijsvnvnlzkqlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119217.027237-62-87952444115026/AnsiballZ_file.py'
Jan 22 22:00:17 compute-0 sudo[96638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:17 compute-0 python3.9[96640]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:17 compute-0 sudo[96638]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:18 compute-0 sudo[96790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxqdszkqqmhwdbptndukdkugprjxcwwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119217.7888353-62-260209674021777/AnsiballZ_file.py'
Jan 22 22:00:18 compute-0 sudo[96790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:18 compute-0 python3.9[96792]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:18 compute-0 sudo[96790]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:19 compute-0 python3.9[96942]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:00:19 compute-0 sudo[97093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxuffqzeeoijdxwuzgrabborexuhyijw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119219.3635125-194-88411704259144/AnsiballZ_seboolean.py'
Jan 22 22:00:19 compute-0 sudo[97093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:20 compute-0 python3.9[97095]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 22:00:20 compute-0 sudo[97093]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:21 compute-0 python3.9[97245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:22 compute-0 python3.9[97366]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119220.9075136-218-167846826248560/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:22 compute-0 python3.9[97516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:23 compute-0 python3.9[97637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119222.3708615-263-65112935663809/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:24 compute-0 sudo[97787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tirrbgfaonjbnlytixfwlyndzpwtdmyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119223.850523-314-200608262296951/AnsiballZ_setup.py'
Jan 22 22:00:24 compute-0 sudo[97787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:24 compute-0 python3.9[97789]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 22:00:24 compute-0 sudo[97787]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:25 compute-0 sudo[97871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfciwdxjhsquysujkdvgpxwldlgtzacz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119223.850523-314-200608262296951/AnsiballZ_dnf.py'
Jan 22 22:00:25 compute-0 sudo[97871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:25 compute-0 python3.9[97873]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 22:00:26 compute-0 sudo[97871]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:27 compute-0 sudo[98024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcaeqiyyycprqmascpsjreoegdimoryq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119226.928819-350-50443463425393/AnsiballZ_systemd.py'
Jan 22 22:00:27 compute-0 sudo[98024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:27 compute-0 python3.9[98026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:00:28 compute-0 sudo[98024]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:28 compute-0 python3.9[98179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:29 compute-0 python3.9[98300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119228.2192364-374-235274421614165/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:30 compute-0 python3.9[98450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:30 compute-0 python3.9[98571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119229.5843651-374-262112789062684/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:31 compute-0 ovn_controller[94850]: 2026-01-22T22:00:31Z|00025|memory|INFO|16256 kB peak resident set size after 30.1 seconds
Jan 22 22:00:31 compute-0 ovn_controller[94850]: 2026-01-22T22:00:31Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 22 22:00:31 compute-0 podman[98695]: 2026-01-22 22:00:31.864240254 +0000 UTC m=+0.111382413 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:00:31 compute-0 python3.9[98734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:32 compute-0 python3.9[98868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119231.469268-506-95760645876204/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:33 compute-0 python3.9[99018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:33 compute-0 python3.9[99139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119232.7215164-506-23383040920532/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:34 compute-0 python3.9[99289]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:00:35 compute-0 sudo[99441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohyzyounleezigakixxczlkpyfwlhool ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119235.01917-620-36556302590941/AnsiballZ_file.py'
Jan 22 22:00:35 compute-0 sudo[99441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:35 compute-0 python3.9[99443]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:35 compute-0 sudo[99441]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:36 compute-0 sudo[99593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hupceobjiemophvcatyhslmdhfgtgvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119235.742459-644-47467098191263/AnsiballZ_stat.py'
Jan 22 22:00:36 compute-0 sudo[99593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:36 compute-0 python3.9[99595]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:36 compute-0 sudo[99593]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:36 compute-0 sudo[99671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzyaeuiqydshuvssznbuhvngksfxkcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119235.742459-644-47467098191263/AnsiballZ_file.py'
Jan 22 22:00:36 compute-0 sudo[99671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:36 compute-0 python3.9[99673]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:36 compute-0 sudo[99671]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:37 compute-0 sudo[99823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycvjjnoxiasrbfcryatpzytybgvduckr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119236.8825197-644-156198045647859/AnsiballZ_stat.py'
Jan 22 22:00:37 compute-0 sudo[99823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:37 compute-0 python3.9[99825]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:37 compute-0 sudo[99823]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:37 compute-0 sudo[99901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjeyxebwqncwnlbeqisfsahsklaiydfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119236.8825197-644-156198045647859/AnsiballZ_file.py'
Jan 22 22:00:37 compute-0 sudo[99901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:37 compute-0 python3.9[99903]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:37 compute-0 sudo[99901]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:38 compute-0 sudo[100053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rleoejbiznqhdqmvkcdkjgfaaulrubai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119238.243491-713-167296760816281/AnsiballZ_file.py'
Jan 22 22:00:38 compute-0 sudo[100053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:38 compute-0 python3.9[100055]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:38 compute-0 sudo[100053]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:39 compute-0 sudo[100205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiruogazfvhuezdgiyxdgksgxidcwmww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119239.1592352-737-235309118786330/AnsiballZ_stat.py'
Jan 22 22:00:39 compute-0 sudo[100205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:39 compute-0 python3.9[100207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:39 compute-0 sudo[100205]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:39 compute-0 sudo[100283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxhkgswhbsggntzusevziwvbsykacyml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119239.1592352-737-235309118786330/AnsiballZ_file.py'
Jan 22 22:00:39 compute-0 sudo[100283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:40 compute-0 python3.9[100285]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:40 compute-0 sudo[100283]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:40 compute-0 sudo[100435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vveutrnsiezvkiteeebrnsvjgpboldcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119240.3714824-773-234957767233553/AnsiballZ_stat.py'
Jan 22 22:00:40 compute-0 sudo[100435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:40 compute-0 python3.9[100437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:40 compute-0 sudo[100435]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:41 compute-0 sudo[100513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqzzqbrgdyrylytngspxtnpjsrapkdmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119240.3714824-773-234957767233553/AnsiballZ_file.py'
Jan 22 22:00:41 compute-0 sudo[100513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:41 compute-0 python3.9[100515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:41 compute-0 sudo[100513]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:41 compute-0 sudo[100665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izutzewojeohzasnjnuskdyrtbdoyeyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119241.6142058-809-15944555159857/AnsiballZ_systemd.py'
Jan 22 22:00:41 compute-0 sudo[100665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:42 compute-0 python3.9[100667]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:00:42 compute-0 systemd[1]: Reloading.
Jan 22 22:00:42 compute-0 systemd-rc-local-generator[100691]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:00:42 compute-0 systemd-sysv-generator[100698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:00:42 compute-0 sudo[100665]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:43 compute-0 sudo[100854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grwpgkvghjbfipxgyzjtdbaytzvgvbvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119242.9096246-833-255139894626792/AnsiballZ_stat.py'
Jan 22 22:00:43 compute-0 sudo[100854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:43 compute-0 python3.9[100856]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:43 compute-0 sudo[100854]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:43 compute-0 sudo[100932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-docpemqigosfdpeuwyvhnbthidpydldm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119242.9096246-833-255139894626792/AnsiballZ_file.py'
Jan 22 22:00:43 compute-0 sudo[100932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:43 compute-0 python3.9[100934]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:43 compute-0 sudo[100932]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:44 compute-0 sudo[101084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjlulzsthhuzlxxltryjgzpzdxgutedf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119244.1304202-869-267129812538080/AnsiballZ_stat.py'
Jan 22 22:00:44 compute-0 sudo[101084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:44 compute-0 python3.9[101086]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:44 compute-0 sudo[101084]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:44 compute-0 sudo[101162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqghgzwfbssedkewuvzlmzlrqzupbpvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119244.1304202-869-267129812538080/AnsiballZ_file.py'
Jan 22 22:00:44 compute-0 sudo[101162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:44 compute-0 python3.9[101164]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:45 compute-0 sudo[101162]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:45 compute-0 sudo[101314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghrtclowqsdgxvxqyuunssbavtfmgfmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119245.3721292-905-114632852374250/AnsiballZ_systemd.py'
Jan 22 22:00:45 compute-0 sudo[101314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:45 compute-0 python3.9[101316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:00:45 compute-0 systemd[1]: Reloading.
Jan 22 22:00:46 compute-0 systemd-rc-local-generator[101343]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:00:46 compute-0 systemd-sysv-generator[101348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:00:46 compute-0 systemd[1]: Starting Create netns directory...
Jan 22 22:00:46 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 22:00:46 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 22:00:46 compute-0 systemd[1]: Finished Create netns directory.
Jan 22 22:00:46 compute-0 sudo[101314]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:46 compute-0 sudo[101508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqbttakxthicctemqdqfhqyycswaqjay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119246.7177548-935-127033975056247/AnsiballZ_file.py'
Jan 22 22:00:46 compute-0 sudo[101508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:47 compute-0 python3.9[101510]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:47 compute-0 sudo[101508]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:47 compute-0 sudo[101660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjerqhyrftvnownzaeoxwiwiiqrbzkom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119247.529728-959-209502715397837/AnsiballZ_stat.py'
Jan 22 22:00:47 compute-0 sudo[101660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:48 compute-0 python3.9[101662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:48 compute-0 sudo[101660]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:48 compute-0 sudo[101783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atcrnfubcdffjrejzsmhyfwadqxsvflp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119247.529728-959-209502715397837/AnsiballZ_copy.py'
Jan 22 22:00:48 compute-0 sudo[101783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:48 compute-0 python3.9[101785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119247.529728-959-209502715397837/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:48 compute-0 sudo[101783]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:49 compute-0 sudo[101935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieaqwtkfgeccooneansfmbhbgzwfvcyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119249.1036606-1010-239089905274830/AnsiballZ_file.py'
Jan 22 22:00:49 compute-0 sudo[101935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:49 compute-0 python3.9[101937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:49 compute-0 sudo[101935]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:50 compute-0 sudo[102087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omcapqzgudnbeaqpqxdbvjqehtrnldya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119249.7979488-1034-112658104498906/AnsiballZ_file.py'
Jan 22 22:00:50 compute-0 sudo[102087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:50 compute-0 python3.9[102089]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:00:50 compute-0 sudo[102087]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:50 compute-0 sudo[102239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzmwhfztlfylldowypuosolflrvpjlfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119250.5681188-1058-190668295281807/AnsiballZ_stat.py'
Jan 22 22:00:50 compute-0 sudo[102239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:51 compute-0 python3.9[102241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:00:51 compute-0 sudo[102239]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:51 compute-0 sudo[102362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbvfhafreppppkdsubaglgrdvkerxrun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119250.5681188-1058-190668295281807/AnsiballZ_copy.py'
Jan 22 22:00:51 compute-0 sudo[102362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:51 compute-0 python3.9[102364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119250.5681188-1058-190668295281807/.source.json _original_basename=.v0p1ykfk follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:51 compute-0 sudo[102362]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:52 compute-0 python3.9[102514]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:00:54 compute-0 sudo[102935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjehhgjdiabmhamrcezvekabdidtwncb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119254.2981875-1178-189878701565106/AnsiballZ_container_config_data.py'
Jan 22 22:00:54 compute-0 sudo[102935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:54 compute-0 python3.9[102937]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 22 22:00:54 compute-0 sudo[102935]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:55 compute-0 sudo[103087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndtasfezhsulozregdsaubllcyazhryd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119255.400452-1211-29994920051131/AnsiballZ_container_config_hash.py'
Jan 22 22:00:55 compute-0 sudo[103087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:56 compute-0 python3.9[103089]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:00:56 compute-0 sudo[103087]: pam_unix(sudo:session): session closed for user root
Jan 22 22:00:57 compute-0 sudo[103239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnhtzaqhbydazafyuxsakvkuwmawoozk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119256.5087368-1241-196430694886543/AnsiballZ_edpm_container_manage.py'
Jan 22 22:00:57 compute-0 sudo[103239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:00:57 compute-0 python3[103241]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:01:01 compute-0 CROND[103300]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 22:01:01 compute-0 run-parts[103303]: (/etc/cron.hourly) starting 0anacron
Jan 22 22:01:01 compute-0 anacron[103311]: Anacron started on 2026-01-22
Jan 22 22:01:01 compute-0 anacron[103311]: Will run job `cron.daily' in 36 min.
Jan 22 22:01:01 compute-0 anacron[103311]: Will run job `cron.weekly' in 56 min.
Jan 22 22:01:01 compute-0 anacron[103311]: Will run job `cron.monthly' in 76 min.
Jan 22 22:01:01 compute-0 anacron[103311]: Jobs will be executed sequentially
Jan 22 22:01:01 compute-0 run-parts[103313]: (/etc/cron.hourly) finished 0anacron
Jan 22 22:01:01 compute-0 CROND[103299]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 22:01:03 compute-0 podman[103314]: 2026-01-22 22:01:03.088970767 +0000 UTC m=+1.009540585 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 22 22:01:03 compute-0 sshd-session[103347]: error: kex_exchange_identification: read: Connection reset by peer
Jan 22 22:01:03 compute-0 sshd-session[103347]: Connection reset by 176.120.22.52 port 64987
Jan 22 22:01:04 compute-0 podman[103254]: 2026-01-22 22:01:04.85569355 +0000 UTC m=+7.429104232 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:01:05 compute-0 podman[103396]: 2026-01-22 22:01:05.12241711 +0000 UTC m=+0.088599713 container create 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:01:05 compute-0 podman[103396]: 2026-01-22 22:01:05.074337694 +0000 UTC m=+0.040520367 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:01:05 compute-0 python3[103241]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:01:05 compute-0 sudo[103239]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:05 compute-0 sudo[103584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livpeafipjhheqkokpjwqdhhdpamkrou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119265.547337-1265-244491147942591/AnsiballZ_stat.py'
Jan 22 22:01:05 compute-0 sudo[103584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:06 compute-0 python3.9[103586]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:01:06 compute-0 sudo[103584]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:06 compute-0 sudo[103738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fckqmhhjgczmrtxggrmanztsraailwjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119266.4917853-1292-33914669963364/AnsiballZ_file.py'
Jan 22 22:01:06 compute-0 sudo[103738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:07 compute-0 python3.9[103740]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:07 compute-0 sudo[103738]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:07 compute-0 sudo[103814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hreipgtxwzjhxbzfkttimyluhdbyzund ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119266.4917853-1292-33914669963364/AnsiballZ_stat.py'
Jan 22 22:01:07 compute-0 sudo[103814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:07 compute-0 python3.9[103816]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:01:07 compute-0 sudo[103814]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:08 compute-0 sudo[103965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlygcdhsexoonjfcozbzmotpasbedbyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119267.7260792-1292-235637079911869/AnsiballZ_copy.py'
Jan 22 22:01:08 compute-0 sudo[103965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:08 compute-0 python3.9[103967]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119267.7260792-1292-235637079911869/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:08 compute-0 sudo[103965]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:08 compute-0 sudo[104041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekjmmgrnwzcnitqxradcivukdrznkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119267.7260792-1292-235637079911869/AnsiballZ_systemd.py'
Jan 22 22:01:08 compute-0 sudo[104041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:09 compute-0 python3.9[104043]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:01:09 compute-0 systemd[1]: Reloading.
Jan 22 22:01:09 compute-0 systemd-sysv-generator[104070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:01:09 compute-0 systemd-rc-local-generator[104065]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:01:09 compute-0 sudo[104041]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:09 compute-0 sudo[104152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bydnnablgaxdufiyycdnwiqquxxaujrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119267.7260792-1292-235637079911869/AnsiballZ_systemd.py'
Jan 22 22:01:09 compute-0 sudo[104152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:09 compute-0 python3.9[104154]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:10 compute-0 systemd[1]: Reloading.
Jan 22 22:01:10 compute-0 systemd-sysv-generator[104187]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:01:10 compute-0 systemd-rc-local-generator[104183]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:01:10 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 22 22:01:10 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:01:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3d4ae54cf347c72935452a253c605bd2e45290598f918c2738258e214e2fade/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 22 22:01:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3d4ae54cf347c72935452a253c605bd2e45290598f918c2738258e214e2fade/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:01:10 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462.
Jan 22 22:01:10 compute-0 podman[104195]: 2026-01-22 22:01:10.486046406 +0000 UTC m=+0.175518006 container init 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + sudo -E kolla_set_configs
Jan 22 22:01:10 compute-0 podman[104195]: 2026-01-22 22:01:10.52688861 +0000 UTC m=+0.216360250 container start 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:01:10 compute-0 edpm-start-podman-container[104195]: ovn_metadata_agent
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Validating config file
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Copying service configuration files
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Writing out command to execute
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 22 22:01:10 compute-0 edpm-start-podman-container[104194]: Creating additional drop-in dependency for "ovn_metadata_agent" (7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462)
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: ++ cat /run_command
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + CMD=neutron-ovn-metadata-agent
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + ARGS=
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + sudo kolla_copy_cacerts
Jan 22 22:01:10 compute-0 podman[104217]: 2026-01-22 22:01:10.6228386 +0000 UTC m=+0.082839186 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:01:10 compute-0 systemd[1]: Reloading.
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + [[ ! -n '' ]]
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + . kolla_extend_start
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: Running command: 'neutron-ovn-metadata-agent'
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + umask 0022
Jan 22 22:01:10 compute-0 ovn_metadata_agent[104210]: + exec neutron-ovn-metadata-agent
Jan 22 22:01:10 compute-0 systemd-rc-local-generator[104290]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:01:10 compute-0 systemd-sysv-generator[104294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:01:10 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 22 22:01:10 compute-0 sudo[104152]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:11 compute-0 python3.9[104448]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.359 104215 INFO neutron.common.config [-] Logging enabled!
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.359 104215 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.359 104215 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.360 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.360 104215 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.360 104215 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.360 104215 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.361 104215 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.362 104215 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.363 104215 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.364 104215 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.365 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.366 104215 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.367 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.368 104215 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.369 104215 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.370 104215 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.371 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.372 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.373 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.374 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.375 104215 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.376 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.377 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.378 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.379 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.380 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.381 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.382 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.383 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.384 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.385 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.386 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.387 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.388 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.389 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.390 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.391 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.392 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.393 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.394 104215 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.405 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.405 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.405 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.406 104215 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.406 104215 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.420 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6 (UUID: 67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.441 104215 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.441 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.441 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.442 104215 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.444 104215 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.450 104215 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.456 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], external_ids={}, name=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, nb_cfg_timestamp=1769119209789, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.456 104215 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fa026188b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.457 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.457 104215 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.458 104215 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.458 104215 INFO oslo_service.service [-] Starting 1 workers
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.462 104215 DEBUG oslo_service.service [-] Started child 104473 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.465 104215 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp5x35iyon/privsep.sock']
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.467 104473 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-163741'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.498 104473 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.498 104473 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.499 104473 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.503 104473 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.510 104473 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 22:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:12.521 104473 INFO eventlet.wsgi.server [-] (104473) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 22 22:01:12 compute-0 sudo[104603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pewxnyrmbhpvsjlprzufqhqsdpwyiuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119272.552333-1427-188374253623858/AnsiballZ_stat.py'
Jan 22 22:01:12 compute-0 sudo[104603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:13 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.141 104215 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.142 104215 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5x35iyon/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.027 104606 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.034 104606 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.038 104606 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.038 104606 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104606
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.144 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[fff64a76-51d4-438d-90cc-8911d6bf2bcb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:01:13 compute-0 python3.9[104605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:01:13 compute-0 sudo[104603]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.617 104606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.617 104606 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:13.617 104606 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:01:13 compute-0 sudo[104733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvpmdnxlcbxucsyuiqtskmhciqctdrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119272.552333-1427-188374253623858/AnsiballZ_copy.py'
Jan 22 22:01:13 compute-0 sudo[104733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:14 compute-0 python3.9[104735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119272.552333-1427-188374253623858/.source.yaml _original_basename=.pigt2g38 follow=False checksum=e950e86bcd2f4ea172cbb26957801b4267cffda4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:14 compute-0 sudo[104733]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.130 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4dafb4ef-5741-4f11-b08e-395a8c8cf409]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.133 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, column=external_ids, values=({'neutron:ovn-metadata-id': '273f2177-08d5-5f73-9bed-8dfffa191625'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.346 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.354 104215 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.354 104215 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.355 104215 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.355 104215 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.355 104215 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.355 104215 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.356 104215 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.356 104215 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.357 104215 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.357 104215 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.357 104215 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.358 104215 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.358 104215 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.358 104215 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.359 104215 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.359 104215 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.359 104215 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.359 104215 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.360 104215 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.360 104215 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.360 104215 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.360 104215 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.361 104215 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.361 104215 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.361 104215 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.361 104215 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.362 104215 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.362 104215 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.362 104215 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.363 104215 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.363 104215 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.363 104215 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.363 104215 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.364 104215 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.364 104215 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.364 104215 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.364 104215 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.365 104215 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.365 104215 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.365 104215 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.366 104215 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.366 104215 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.366 104215 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.366 104215 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.367 104215 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.367 104215 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.367 104215 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.367 104215 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.367 104215 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.368 104215 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.368 104215 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.368 104215 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.368 104215 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.368 104215 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.369 104215 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.369 104215 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.369 104215 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.369 104215 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.370 104215 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.370 104215 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.370 104215 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.370 104215 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.371 104215 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.371 104215 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.371 104215 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.372 104215 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.372 104215 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.372 104215 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.372 104215 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.373 104215 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.373 104215 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.373 104215 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.373 104215 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.374 104215 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.374 104215 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.374 104215 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.374 104215 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.375 104215 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.375 104215 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.375 104215 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.375 104215 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.376 104215 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.376 104215 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.376 104215 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.376 104215 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.377 104215 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.377 104215 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.377 104215 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.377 104215 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.378 104215 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.378 104215 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.378 104215 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.378 104215 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.378 104215 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.379 104215 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.379 104215 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.379 104215 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.379 104215 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.379 104215 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.380 104215 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.380 104215 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.380 104215 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.380 104215 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.380 104215 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.381 104215 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.381 104215 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.381 104215 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.381 104215 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.382 104215 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.382 104215 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.382 104215 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.382 104215 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.383 104215 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.383 104215 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.383 104215 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.383 104215 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.384 104215 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.384 104215 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.384 104215 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.384 104215 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.384 104215 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.385 104215 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.385 104215 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.385 104215 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.385 104215 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.386 104215 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.386 104215 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.386 104215 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.386 104215 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.387 104215 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.387 104215 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.387 104215 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.387 104215 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.388 104215 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.388 104215 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.388 104215 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.388 104215 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.389 104215 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.389 104215 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.389 104215 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.389 104215 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.390 104215 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.390 104215 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.390 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.390 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.390 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.391 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.391 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.391 104215 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.391 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.391 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.392 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.392 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.392 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.392 104215 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.392 104215 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.393 104215 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.393 104215 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.393 104215 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.393 104215 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.393 104215 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.394 104215 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.394 104215 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.394 104215 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.395 104215 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.395 104215 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.395 104215 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.395 104215 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.396 104215 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.396 104215 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.396 104215 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.396 104215 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.397 104215 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.397 104215 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.397 104215 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.397 104215 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.398 104215 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.399 104215 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.400 104215 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.401 104215 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.402 104215 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.403 104215 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.403 104215 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.403 104215 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.403 104215 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.403 104215 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.404 104215 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.405 104215 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.406 104215 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.407 104215 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.408 104215 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.409 104215 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.410 104215 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.411 104215 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.411 104215 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.411 104215 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.411 104215 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.411 104215 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.412 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.413 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.414 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.415 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.416 104215 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.417 104215 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.417 104215 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.417 104215 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.417 104215 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:01:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:01:14.417 104215 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:01:14 compute-0 sshd-session[95878]: Connection closed by 192.168.122.30 port 37962
Jan 22 22:01:14 compute-0 sshd-session[95875]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:01:14 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Jan 22 22:01:14 compute-0 systemd[1]: session-21.scope: Consumed 56.693s CPU time.
Jan 22 22:01:14 compute-0 systemd-logind[801]: Session 21 logged out. Waiting for processes to exit.
Jan 22 22:01:14 compute-0 systemd-logind[801]: Removed session 21.
Jan 22 22:01:21 compute-0 sshd-session[104760]: Accepted publickey for zuul from 192.168.122.30 port 52542 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 22:01:21 compute-0 systemd-logind[801]: New session 22 of user zuul.
Jan 22 22:01:21 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 22 22:01:21 compute-0 sshd-session[104760]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 22:01:23 compute-0 python3.9[104913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:01:25 compute-0 sudo[105091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moxgzpllvaoqijtkeswejxryjhhwxani ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119285.0651197-62-69094580242126/AnsiballZ_command.py'
Jan 22 22:01:25 compute-0 sudo[105091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:25 compute-0 python3.9[105093]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:01:25 compute-0 sudo[105091]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:29 compute-0 sudo[105256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbwqpuwckkuutjrtjthctwyxwqzgbakt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119288.8075025-95-162622579525103/AnsiballZ_systemd_service.py'
Jan 22 22:01:29 compute-0 sudo[105256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:29 compute-0 python3.9[105258]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:01:29 compute-0 systemd[1]: Reloading.
Jan 22 22:01:29 compute-0 systemd-sysv-generator[105289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:01:29 compute-0 systemd-rc-local-generator[105286]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:01:30 compute-0 sudo[105256]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:31 compute-0 python3.9[105443]: ansible-ansible.builtin.service_facts Invoked
Jan 22 22:01:31 compute-0 network[105460]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 22:01:31 compute-0 network[105461]: 'network-scripts' will be removed from distribution in near future.
Jan 22 22:01:31 compute-0 network[105462]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 22:01:34 compute-0 podman[105502]: 2026-01-22 22:01:34.542217021 +0000 UTC m=+0.144807718 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 22:01:38 compute-0 sudo[105748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghabowwchspsroxgrtonycplyanhprl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119298.0370133-152-66174089739830/AnsiballZ_systemd_service.py'
Jan 22 22:01:38 compute-0 sudo[105748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:38 compute-0 python3.9[105750]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:38 compute-0 sudo[105748]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:39 compute-0 sudo[105901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjdtjcjuddkxcraxtoknzrkziypzzcmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119299.1416535-152-252015699536288/AnsiballZ_systemd_service.py'
Jan 22 22:01:39 compute-0 sudo[105901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:39 compute-0 python3.9[105903]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:39 compute-0 sudo[105901]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:40 compute-0 sudo[106054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gizfugohstyslwgvyfgrdxgwobbfnisb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119300.034939-152-197490991391391/AnsiballZ_systemd_service.py'
Jan 22 22:01:40 compute-0 sudo[106054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:40 compute-0 python3.9[106056]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:40 compute-0 sudo[106054]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:40 compute-0 podman[106058]: 2026-01-22 22:01:40.831611043 +0000 UTC m=+0.078394056 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:01:41 compute-0 sudo[106226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzwkvndovtfllqoymvnnbqfdknnzluf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119300.8971434-152-44655530832570/AnsiballZ_systemd_service.py'
Jan 22 22:01:41 compute-0 sudo[106226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:41 compute-0 python3.9[106228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:41 compute-0 sudo[106226]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:42 compute-0 sudo[106379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrhxhphypymxxxwqniolhueejqfyepqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119301.7612562-152-145694896247435/AnsiballZ_systemd_service.py'
Jan 22 22:01:42 compute-0 sudo[106379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:42 compute-0 python3.9[106381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:42 compute-0 sudo[106379]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:42 compute-0 sudo[106532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tapntdolsyftehfgpxrgfdceckydfvgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119302.6131554-152-22121282022511/AnsiballZ_systemd_service.py'
Jan 22 22:01:42 compute-0 sudo[106532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:43 compute-0 python3.9[106534]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:43 compute-0 sudo[106532]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:43 compute-0 sudo[106685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlfhidhhmapvspvlrxzclutfnqygocbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119303.5122955-152-148295430674419/AnsiballZ_systemd_service.py'
Jan 22 22:01:43 compute-0 sudo[106685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:44 compute-0 python3.9[106687]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:01:44 compute-0 sudo[106685]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:45 compute-0 sudo[106838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veyftavauggxastemkqrzzcvweforxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119304.7368078-308-274911184671790/AnsiballZ_file.py'
Jan 22 22:01:45 compute-0 sudo[106838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:45 compute-0 python3.9[106840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:45 compute-0 sudo[106838]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:46 compute-0 sudo[106990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxssrtvpodksvfhhziezdnunwtfszsui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119305.681581-308-250690886628825/AnsiballZ_file.py'
Jan 22 22:01:46 compute-0 sudo[106990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:46 compute-0 python3.9[106992]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:46 compute-0 sudo[106990]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:46 compute-0 sudo[107142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkdnttouseqkeodxgsjgtkcgyicwxcfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119306.4045973-308-250693450686398/AnsiballZ_file.py'
Jan 22 22:01:46 compute-0 sudo[107142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:46 compute-0 python3.9[107144]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:46 compute-0 sudo[107142]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:47 compute-0 sudo[107294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqbqyykngjgcksqcugaivgzkbdryubtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119307.1662955-308-20233203085464/AnsiballZ_file.py'
Jan 22 22:01:47 compute-0 sudo[107294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:47 compute-0 python3.9[107296]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:47 compute-0 sudo[107294]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:48 compute-0 sudo[107446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtwuebumnvydhnsnecukfhortptlgref ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119308.0874794-308-57285816938921/AnsiballZ_file.py'
Jan 22 22:01:48 compute-0 sudo[107446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:48 compute-0 python3.9[107448]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:48 compute-0 sudo[107446]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:49 compute-0 sudo[107598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkvvkhtwltmfpaekkjwjrsbthzrorxqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119308.95939-308-232413715513572/AnsiballZ_file.py'
Jan 22 22:01:49 compute-0 sudo[107598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:49 compute-0 python3.9[107600]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:49 compute-0 sudo[107598]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:50 compute-0 sudo[107750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njhsadljiwvwcfeynuhdfuksacuozufy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119309.6680424-308-185557711988295/AnsiballZ_file.py'
Jan 22 22:01:50 compute-0 sudo[107750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:50 compute-0 python3.9[107752]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:50 compute-0 sudo[107750]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:50 compute-0 sudo[107902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umzwkennsbvvvaqivkzjcgvtnewcwefn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119310.447055-458-131553126026702/AnsiballZ_file.py'
Jan 22 22:01:50 compute-0 sudo[107902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:51 compute-0 python3.9[107904]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:51 compute-0 sudo[107902]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:51 compute-0 sudo[108054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpxzhawuzycuwkolefueeawvhxmsnxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119311.1902351-458-131508169159344/AnsiballZ_file.py'
Jan 22 22:01:51 compute-0 sudo[108054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:51 compute-0 python3.9[108056]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:51 compute-0 sudo[108054]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:52 compute-0 sudo[108206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfoaynrccwxygxnjuoalemzgyxlswyis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119311.8856473-458-14576525170977/AnsiballZ_file.py'
Jan 22 22:01:52 compute-0 sudo[108206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:52 compute-0 python3.9[108208]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:52 compute-0 sudo[108206]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:52 compute-0 sudo[108358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gezkpglpwvtwansejqzpmuvffodndibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119312.597856-458-81555247381439/AnsiballZ_file.py'
Jan 22 22:01:52 compute-0 sudo[108358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:53 compute-0 python3.9[108360]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:53 compute-0 sudo[108358]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:53 compute-0 sudo[108510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijatmsciftkonwgrgteuoznfcpzhmgil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119313.3878667-458-59248689255292/AnsiballZ_file.py'
Jan 22 22:01:53 compute-0 sudo[108510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:53 compute-0 python3.9[108512]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:53 compute-0 sudo[108510]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:54 compute-0 sudo[108662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exwmwwnxuanjqlmfddplmpnvbnkmihxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119314.1401255-458-133517589301881/AnsiballZ_file.py'
Jan 22 22:01:54 compute-0 sudo[108662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:54 compute-0 python3.9[108664]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:54 compute-0 sudo[108662]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:55 compute-0 sudo[108814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmdcxgncgjpgoesnhrelqawwpepczmvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119314.7943337-458-150960471822739/AnsiballZ_file.py'
Jan 22 22:01:55 compute-0 sudo[108814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:55 compute-0 python3.9[108816]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:01:55 compute-0 sudo[108814]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:56 compute-0 sudo[108966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yektigmjqbefntsvowfvwxttvfcaxbhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119315.6764607-611-196194926830664/AnsiballZ_command.py'
Jan 22 22:01:56 compute-0 sudo[108966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:56 compute-0 python3.9[108968]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:01:56 compute-0 sudo[108966]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:57 compute-0 python3.9[109121]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 22:01:57 compute-0 sudo[109271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anupmjvrywwjggajcqhqfspnpampujyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119317.5273995-665-75645000154684/AnsiballZ_systemd_service.py'
Jan 22 22:01:57 compute-0 sudo[109271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:58 compute-0 python3.9[109273]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:01:58 compute-0 systemd[1]: Reloading.
Jan 22 22:01:58 compute-0 systemd-rc-local-generator[109302]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:01:58 compute-0 systemd-sysv-generator[109305]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:01:58 compute-0 sudo[109271]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:59 compute-0 sudo[109459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sebbesrnldfgslnpnprsqtonkqlmuxik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119318.7256894-689-212535370181533/AnsiballZ_command.py'
Jan 22 22:01:59 compute-0 sudo[109459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:01:59 compute-0 python3.9[109461]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:01:59 compute-0 sudo[109459]: pam_unix(sudo:session): session closed for user root
Jan 22 22:01:59 compute-0 sudo[109612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giprdmbyaxibcpwsptevozymootzqkbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119319.4969926-689-189440510276736/AnsiballZ_command.py'
Jan 22 22:01:59 compute-0 sudo[109612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:00 compute-0 python3.9[109614]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:00 compute-0 sudo[109612]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:00 compute-0 sudo[109765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhiyvjjlmbpvfkyzytiqfnrhulkkhhbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119320.2735054-689-40661964609240/AnsiballZ_command.py'
Jan 22 22:02:00 compute-0 sudo[109765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:00 compute-0 python3.9[109767]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:00 compute-0 sudo[109765]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:01 compute-0 sudo[109918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qozdddqruxxcitjbawditamiozqsyxgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119321.0110688-689-195173044140155/AnsiballZ_command.py'
Jan 22 22:02:01 compute-0 sudo[109918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:01 compute-0 python3.9[109920]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:01 compute-0 sudo[109918]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:02 compute-0 sudo[110071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibpucfkyaydefuwlrobavecgamkjjgks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119321.700769-689-206659152380587/AnsiballZ_command.py'
Jan 22 22:02:02 compute-0 sudo[110071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:02 compute-0 python3.9[110073]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:02 compute-0 sudo[110071]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:02 compute-0 sudo[110224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpjyrkzllqqofunkncqezdnaqvuqmwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119322.3973262-689-155270480455020/AnsiballZ_command.py'
Jan 22 22:02:02 compute-0 sudo[110224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:02 compute-0 python3.9[110226]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:02 compute-0 sudo[110224]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:03 compute-0 sudo[110377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gscfxvadfltvikycnertpmxkrswhwlmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119323.0720925-689-230850750976619/AnsiballZ_command.py'
Jan 22 22:02:03 compute-0 sudo[110377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:03 compute-0 python3.9[110379]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:02:03 compute-0 sudo[110377]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:04 compute-0 sudo[110547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gknoxqxeowtkqhrjpqdopokptgkvenhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119324.1824682-851-194617857117131/AnsiballZ_getent.py'
Jan 22 22:02:04 compute-0 sudo[110547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:04 compute-0 podman[110504]: 2026-01-22 22:02:04.777899698 +0000 UTC m=+0.115458183 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:02:04 compute-0 python3.9[110553]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 22 22:02:04 compute-0 sudo[110547]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:05 compute-0 sudo[110709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwdwkolzfjxhparsplarpfnoafhbnwoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119325.154739-875-14482380739140/AnsiballZ_group.py'
Jan 22 22:02:05 compute-0 sudo[110709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:05 compute-0 python3.9[110711]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 22:02:05 compute-0 groupadd[110712]: group added to /etc/group: name=libvirt, GID=42473
Jan 22 22:02:05 compute-0 groupadd[110712]: group added to /etc/gshadow: name=libvirt
Jan 22 22:02:05 compute-0 groupadd[110712]: new group: name=libvirt, GID=42473
Jan 22 22:02:05 compute-0 sudo[110709]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:06 compute-0 sudo[110867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhngbuanqbkhszrbkjgqqteamsiapiph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119326.212141-899-48901627467507/AnsiballZ_user.py'
Jan 22 22:02:06 compute-0 sudo[110867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:06 compute-0 python3.9[110869]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 22:02:07 compute-0 useradd[110871]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 22:02:07 compute-0 sudo[110867]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:07 compute-0 sudo[111027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soggunaefsipyozfmhfgzhagvovtxjmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119327.5223527-932-258708458419217/AnsiballZ_setup.py'
Jan 22 22:02:07 compute-0 sudo[111027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:08 compute-0 python3.9[111029]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 22:02:08 compute-0 sudo[111027]: pam_unix(sudo:session): session closed for user root
Jan 22 22:02:08 compute-0 sudo[111111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbvvxqfbvkakctxnxdosidhyrcbsryqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119327.5223527-932-258708458419217/AnsiballZ_dnf.py'
Jan 22 22:02:08 compute-0 sudo[111111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:02:08 compute-0 python3.9[111113]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 22:02:11 compute-0 podman[111121]: 2026-01-22 22:02:11.1304255 +0000 UTC m=+0.063364824 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:02:12.408 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:02:12.409 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:02:12.410 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:02:35 compute-0 podman[111323]: 2026-01-22 22:02:35.214571111 +0000 UTC m=+0.141935755 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:02:42 compute-0 podman[111354]: 2026-01-22 22:02:42.122433553 +0000 UTC m=+0.059004443 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:02:42 compute-0 kernel: SELinux:  Converting 2763 SID table entries...
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 22:02:42 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  Converting 2763 SID table entries...
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 22:02:51 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 22:03:06 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 22 22:03:06 compute-0 podman[112422]: 2026-01-22 22:03:06.257872274 +0000 UTC m=+0.162410254 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 22:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:03:12.410 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:03:12.410 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:03:12.410 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:03:13 compute-0 podman[115877]: 2026-01-22 22:03:13.123258723 +0000 UTC m=+0.055445897 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:03:37 compute-0 podman[128070]: 2026-01-22 22:03:37.152601993 +0000 UTC m=+0.092928727 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:03:44 compute-0 podman[128324]: 2026-01-22 22:03:44.126613785 +0000 UTC m=+0.058239746 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:03:54 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 22:03:54 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 22:03:55 compute-0 groupadd[128356]: group added to /etc/group: name=dnsmasq, GID=993
Jan 22 22:03:55 compute-0 groupadd[128356]: group added to /etc/gshadow: name=dnsmasq
Jan 22 22:03:55 compute-0 groupadd[128356]: new group: name=dnsmasq, GID=993
Jan 22 22:03:55 compute-0 useradd[128363]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 22 22:03:55 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 22:03:55 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 22 22:03:55 compute-0 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Jan 22 22:03:56 compute-0 groupadd[128376]: group added to /etc/group: name=clevis, GID=992
Jan 22 22:03:56 compute-0 groupadd[128376]: group added to /etc/gshadow: name=clevis
Jan 22 22:03:56 compute-0 groupadd[128376]: new group: name=clevis, GID=992
Jan 22 22:03:56 compute-0 useradd[128383]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 22 22:03:56 compute-0 usermod[128393]: add 'clevis' to group 'tss'
Jan 22 22:03:56 compute-0 usermod[128393]: add 'clevis' to shadow group 'tss'
Jan 22 22:03:58 compute-0 polkitd[43395]: Reloading rules
Jan 22 22:03:58 compute-0 polkitd[43395]: Collecting garbage unconditionally...
Jan 22 22:03:58 compute-0 polkitd[43395]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 22:03:58 compute-0 polkitd[43395]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 22:03:58 compute-0 polkitd[43395]: Finished loading, compiling and executing 3 rules
Jan 22 22:03:58 compute-0 polkitd[43395]: Reloading rules
Jan 22 22:03:58 compute-0 polkitd[43395]: Collecting garbage unconditionally...
Jan 22 22:03:58 compute-0 polkitd[43395]: Loading rules from directory /etc/polkit-1/rules.d
Jan 22 22:03:58 compute-0 polkitd[43395]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 22 22:03:58 compute-0 polkitd[43395]: Finished loading, compiling and executing 3 rules
Jan 22 22:04:00 compute-0 groupadd[128583]: group added to /etc/group: name=ceph, GID=167
Jan 22 22:04:00 compute-0 groupadd[128583]: group added to /etc/gshadow: name=ceph
Jan 22 22:04:00 compute-0 groupadd[128583]: new group: name=ceph, GID=167
Jan 22 22:04:00 compute-0 useradd[128589]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 22 22:04:02 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 22 22:04:02 compute-0 sshd[1009]: Received signal 15; terminating.
Jan 22 22:04:02 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 22 22:04:02 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 22 22:04:02 compute-0 systemd[1]: sshd.service: Consumed 1.645s CPU time, read 32.0K from disk, written 0B to disk.
Jan 22 22:04:02 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 22 22:04:02 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 22 22:04:02 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 22:04:02 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 22:04:02 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 22:04:02 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 22 22:04:02 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 22 22:04:03 compute-0 sshd[129108]: Server listening on 0.0.0.0 port 22.
Jan 22 22:04:03 compute-0 sshd[129108]: Server listening on :: port 22.
Jan 22 22:04:03 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 22 22:04:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 22:04:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 22:04:05 compute-0 systemd[1]: Reloading.
Jan 22 22:04:05 compute-0 systemd-rc-local-generator[129368]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:05 compute-0 systemd-sysv-generator[129372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:05 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 22:04:08 compute-0 podman[131732]: 2026-01-22 22:04:08.191251649 +0000 UTC m=+0.111922041 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 22:04:09 compute-0 sudo[111111]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:10 compute-0 sudo[134175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flmikviuwtgsekpkmagskddnwnaamlpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119449.6202464-968-78495418056578/AnsiballZ_systemd.py'
Jan 22 22:04:10 compute-0 sudo[134175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:10 compute-0 python3.9[134201]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:04:10 compute-0 systemd[1]: Reloading.
Jan 22 22:04:10 compute-0 systemd-rc-local-generator[134677]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:10 compute-0 systemd-sysv-generator[134681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:10 compute-0 sudo[134175]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:11 compute-0 sudo[135678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krloatvbykzvaqaxiybuiahjrzzojjyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119451.0669327-968-31526205941131/AnsiballZ_systemd.py'
Jan 22 22:04:11 compute-0 sudo[135678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:11 compute-0 python3.9[135702]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:04:11 compute-0 systemd[1]: Reloading.
Jan 22 22:04:11 compute-0 systemd-rc-local-generator[136084]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:11 compute-0 systemd-sysv-generator[136089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:12 compute-0 sudo[135678]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:12 compute-0 sudo[136855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovifltdpxqrhxgcklioxfuwqdngpdlgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119452.127453-968-61609005920970/AnsiballZ_systemd.py'
Jan 22 22:04:12 compute-0 sudo[136855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:04:12.411 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:04:12.412 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:04:12.412 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:04:12 compute-0 python3.9[136877]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:04:12 compute-0 systemd[1]: Reloading.
Jan 22 22:04:12 compute-0 systemd-sysv-generator[137342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:12 compute-0 systemd-rc-local-generator[137338]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:13 compute-0 sudo[136855]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:13 compute-0 sudo[138047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbdrkcddqbzxiqildzkmpgqqrixdoprc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119453.1582026-968-234563995877985/AnsiballZ_systemd.py'
Jan 22 22:04:13 compute-0 sudo[138047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:13 compute-0 python3.9[138070]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:04:13 compute-0 systemd[1]: Reloading.
Jan 22 22:04:13 compute-0 systemd-rc-local-generator[138402]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:13 compute-0 systemd-sysv-generator[138405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:14 compute-0 sudo[138047]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:14 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 22:04:14 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 22:04:14 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.754s CPU time.
Jan 22 22:04:14 compute-0 systemd[1]: run-rdecfdafaa6f94e7c962ccd0f2130f645.service: Deactivated successfully.
Jan 22 22:04:15 compute-0 podman[138552]: 2026-01-22 22:04:15.155494503 +0000 UTC m=+0.086774893 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:04:15 compute-0 sudo[138696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgmkcwtqrdukkdwtacffpsftyjgesicz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119455.234506-1055-59264236970686/AnsiballZ_systemd.py'
Jan 22 22:04:15 compute-0 sudo[138696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:15 compute-0 python3.9[138698]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:16 compute-0 systemd[1]: Reloading.
Jan 22 22:04:16 compute-0 systemd-rc-local-generator[138724]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:16 compute-0 systemd-sysv-generator[138728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:16 compute-0 sudo[138696]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:16 compute-0 sudo[138886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmosrapdzduyebdoktuqsljxbhpxahhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119456.4683764-1055-256621222867643/AnsiballZ_systemd.py'
Jan 22 22:04:16 compute-0 sudo[138886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:17 compute-0 python3.9[138888]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:17 compute-0 systemd[1]: Reloading.
Jan 22 22:04:17 compute-0 systemd-rc-local-generator[138920]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:17 compute-0 systemd-sysv-generator[138925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:17 compute-0 sudo[138886]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:18 compute-0 sudo[139076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkwdnpniwjnniuuqkseudvaozpglozpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119457.6870267-1055-190512541205806/AnsiballZ_systemd.py'
Jan 22 22:04:18 compute-0 sudo[139076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:18 compute-0 python3.9[139078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:18 compute-0 systemd[1]: Reloading.
Jan 22 22:04:18 compute-0 systemd-rc-local-generator[139109]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:18 compute-0 systemd-sysv-generator[139113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:18 compute-0 sudo[139076]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:19 compute-0 sudo[139266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiqzcnvehnasrimhoefndjzgqjijctiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119459.1629653-1055-215517754259517/AnsiballZ_systemd.py'
Jan 22 22:04:19 compute-0 sudo[139266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:19 compute-0 python3.9[139268]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:19 compute-0 sudo[139266]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:20 compute-0 sudo[139421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqbnfmedaczortvxxcsjskmnapbhgoyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119460.0160341-1055-132728678608070/AnsiballZ_systemd.py'
Jan 22 22:04:20 compute-0 sudo[139421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:20 compute-0 python3.9[139423]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:20 compute-0 systemd[1]: Reloading.
Jan 22 22:04:20 compute-0 systemd-rc-local-generator[139448]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:20 compute-0 systemd-sysv-generator[139453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:20 compute-0 sudo[139421]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:21 compute-0 sudo[139611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfxdezbqjpafejpqxbvgcaoedcyfkgke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119461.6281724-1163-164439085329832/AnsiballZ_systemd.py'
Jan 22 22:04:21 compute-0 sudo[139611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:22 compute-0 python3.9[139613]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 22:04:22 compute-0 systemd[1]: Reloading.
Jan 22 22:04:22 compute-0 systemd-rc-local-generator[139641]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:04:22 compute-0 systemd-sysv-generator[139646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:04:22 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 22 22:04:22 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 22 22:04:22 compute-0 sudo[139611]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:23 compute-0 sudo[139805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nafzcitwtyraprqgwtbzdbkektqumltx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119462.99338-1187-237366836587083/AnsiballZ_systemd.py'
Jan 22 22:04:23 compute-0 sudo[139805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:23 compute-0 python3.9[139807]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:23 compute-0 sudo[139805]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:24 compute-0 sudo[139960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qchfnhpsiwypiijpsxphztsjgknarcur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119464.2282453-1187-227783586189430/AnsiballZ_systemd.py'
Jan 22 22:04:24 compute-0 sudo[139960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:24 compute-0 python3.9[139962]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:25 compute-0 sudo[139960]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:26 compute-0 sudo[140115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqhwjehxxjfarsfvezsbghdryemoxmzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119465.1833506-1187-8970030698853/AnsiballZ_systemd.py'
Jan 22 22:04:26 compute-0 sudo[140115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:26 compute-0 python3.9[140117]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:26 compute-0 sudo[140115]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:27 compute-0 sudo[140270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mejulvjxpskeiaahohqynghuffzkvttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119466.918434-1187-41361931644667/AnsiballZ_systemd.py'
Jan 22 22:04:27 compute-0 sudo[140270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:27 compute-0 python3.9[140272]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:27 compute-0 sudo[140270]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:28 compute-0 sudo[140425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bynsuvrdfmimrzhohfmffojcgaxtiibx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119467.9059129-1187-109781775430858/AnsiballZ_systemd.py'
Jan 22 22:04:28 compute-0 sudo[140425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:28 compute-0 python3.9[140427]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:28 compute-0 sudo[140425]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:29 compute-0 sudo[140580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzmydsytpmqchcgrlorhtnkdoeblhoeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119468.8032002-1187-2047950743369/AnsiballZ_systemd.py'
Jan 22 22:04:29 compute-0 sudo[140580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:29 compute-0 python3.9[140582]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:29 compute-0 sudo[140580]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:30 compute-0 sudo[140735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmutnhuwncgicdnresisktnkqsuwmfhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119469.687714-1187-106822438555240/AnsiballZ_systemd.py'
Jan 22 22:04:30 compute-0 sudo[140735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:30 compute-0 python3.9[140737]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:30 compute-0 sudo[140735]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:31 compute-0 sudo[140890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwywyanzcbndubhveuqeuutfizaqvbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119470.6391606-1187-242802693322505/AnsiballZ_systemd.py'
Jan 22 22:04:31 compute-0 sudo[140890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:31 compute-0 python3.9[140892]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:31 compute-0 sudo[140890]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:31 compute-0 sudo[141045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isyiyxcxeqiophrgfxhebklacfhwmsvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119471.5941796-1187-150302909928914/AnsiballZ_systemd.py'
Jan 22 22:04:31 compute-0 sudo[141045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:32 compute-0 python3.9[141047]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:32 compute-0 sudo[141045]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:32 compute-0 sudo[141200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drqbpqpfmoxuyjqtymvvrlnjyzvycioe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119472.495054-1187-132048481028026/AnsiballZ_systemd.py'
Jan 22 22:04:32 compute-0 sudo[141200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:33 compute-0 python3.9[141202]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:33 compute-0 sudo[141200]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:33 compute-0 sudo[141355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyqxddbghgrrfdtzfjhivtuthibhjcrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119473.5309148-1187-172652091771428/AnsiballZ_systemd.py'
Jan 22 22:04:33 compute-0 sudo[141355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:34 compute-0 python3.9[141357]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:35 compute-0 sudo[141355]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:36 compute-0 sudo[141510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcalemecfydncyeqprwfvrcsykzvioww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119475.6182263-1187-82152875293956/AnsiballZ_systemd.py'
Jan 22 22:04:36 compute-0 sudo[141510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:36 compute-0 python3.9[141512]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:36 compute-0 sudo[141510]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:36 compute-0 sudo[141665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-larjjuedxmdosbdfvdvzknrjvyguuoch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119476.614584-1187-1841852944028/AnsiballZ_systemd.py'
Jan 22 22:04:36 compute-0 sudo[141665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:37 compute-0 python3.9[141667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:37 compute-0 sudo[141665]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:37 compute-0 sudo[141820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzuoqvgagzixftmozbcnlgdvgyzzkglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119477.5305157-1187-84032968222686/AnsiballZ_systemd.py'
Jan 22 22:04:37 compute-0 sudo[141820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:38 compute-0 python3.9[141822]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 22:04:38 compute-0 sudo[141820]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:38 compute-0 podman[141824]: 2026-01-22 22:04:38.427846331 +0000 UTC m=+0.128908093 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:04:39 compute-0 sudo[142001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnukaydbgimsympkzzmmwooveuvkfli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119479.0080266-1493-31283197709812/AnsiballZ_file.py'
Jan 22 22:04:39 compute-0 sudo[142001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:39 compute-0 python3.9[142003]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:39 compute-0 sudo[142001]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:40 compute-0 sudo[142153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbclaplbqvazaawanqvzmaorruxgmhsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119479.934131-1493-78086471417917/AnsiballZ_file.py'
Jan 22 22:04:40 compute-0 sudo[142153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:40 compute-0 python3.9[142155]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:40 compute-0 sudo[142153]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:41 compute-0 sudo[142305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvintdjshtsvgizfvtenkptrilgbfasd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119480.7409804-1493-141928504715931/AnsiballZ_file.py'
Jan 22 22:04:41 compute-0 sudo[142305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:41 compute-0 python3.9[142307]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:41 compute-0 sudo[142305]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:41 compute-0 sudo[142457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrtempgkynrpwrgvbrcimbfcryskjkbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119481.4072704-1493-73302209610148/AnsiballZ_file.py'
Jan 22 22:04:41 compute-0 sudo[142457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:41 compute-0 python3.9[142459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:41 compute-0 sudo[142457]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:42 compute-0 sudo[142609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtbpdhqgtaswhpcavzpiyhxazioslux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119482.0409908-1493-106740299637250/AnsiballZ_file.py'
Jan 22 22:04:42 compute-0 sudo[142609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:42 compute-0 python3.9[142611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:42 compute-0 sudo[142609]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:42 compute-0 sudo[142761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckgmhacgnpcuhxkkingfqezhrzhefnnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119482.6423979-1493-204623464306232/AnsiballZ_file.py'
Jan 22 22:04:42 compute-0 sudo[142761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:43 compute-0 python3.9[142763]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:04:43 compute-0 sudo[142761]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:43 compute-0 python3.9[142913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:04:44 compute-0 sudo[143063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbwjszjzhvqzyjrjtfvjnbcpmmcqyoev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119484.160145-1646-44834206172173/AnsiballZ_stat.py'
Jan 22 22:04:44 compute-0 sudo[143063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:44 compute-0 python3.9[143065]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:44 compute-0 sudo[143063]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:45 compute-0 sudo[143197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfmdoclexigczbabnmyivzzxdatathdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119484.160145-1646-44834206172173/AnsiballZ_copy.py'
Jan 22 22:04:45 compute-0 sudo[143197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:45 compute-0 podman[143162]: 2026-01-22 22:04:45.449145186 +0000 UTC m=+0.085626405 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:04:45 compute-0 python3.9[143209]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119484.160145-1646-44834206172173/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:45 compute-0 sudo[143197]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:46 compute-0 sudo[143359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtmzzkdjxvaznmqxkoswbczykgludqmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119485.793963-1646-167017246578209/AnsiballZ_stat.py'
Jan 22 22:04:46 compute-0 sudo[143359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:46 compute-0 python3.9[143361]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:46 compute-0 sudo[143359]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:47 compute-0 sudo[143484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokgesapfqucfymxiliddzzxbvhkcmoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119485.793963-1646-167017246578209/AnsiballZ_copy.py'
Jan 22 22:04:47 compute-0 sudo[143484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:47 compute-0 python3.9[143486]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119485.793963-1646-167017246578209/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:47 compute-0 sudo[143484]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:48 compute-0 sudo[143636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjoarrvdtpqxtmwwwxmtxatwrbvvmwvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119487.806583-1646-222632563734388/AnsiballZ_stat.py'
Jan 22 22:04:48 compute-0 sudo[143636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:48 compute-0 python3.9[143638]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:48 compute-0 sudo[143636]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:48 compute-0 sudo[143761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxbfopqssdilvcutiyhovefmadaestit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119487.806583-1646-222632563734388/AnsiballZ_copy.py'
Jan 22 22:04:48 compute-0 sudo[143761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:49 compute-0 python3.9[143763]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119487.806583-1646-222632563734388/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:49 compute-0 sudo[143761]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:49 compute-0 sudo[143913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szvxtzcvfplsztalwsysdgvrjcxztgjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119489.2297704-1646-140669244937106/AnsiballZ_stat.py'
Jan 22 22:04:49 compute-0 sudo[143913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:49 compute-0 python3.9[143915]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:49 compute-0 sudo[143913]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:50 compute-0 sudo[144038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdahnnybthkurivdvjtimtzxylpndbba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119489.2297704-1646-140669244937106/AnsiballZ_copy.py'
Jan 22 22:04:50 compute-0 sudo[144038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:50 compute-0 python3.9[144040]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119489.2297704-1646-140669244937106/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:50 compute-0 sudo[144038]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:50 compute-0 sudo[144190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgwqspgzpowwwolutxcvzkjnmmicwqyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119490.611465-1646-7043019026754/AnsiballZ_stat.py'
Jan 22 22:04:50 compute-0 sudo[144190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:51 compute-0 python3.9[144192]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:51 compute-0 sudo[144190]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:51 compute-0 sudo[144315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxwpvfylpovgzjwtyqosibtqqynzjvrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119490.611465-1646-7043019026754/AnsiballZ_copy.py'
Jan 22 22:04:51 compute-0 sudo[144315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:51 compute-0 python3.9[144317]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119490.611465-1646-7043019026754/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:51 compute-0 sudo[144315]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:52 compute-0 sudo[144467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xminwmckxdtomzmfqilrcgbvglrlkhqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119492.020515-1646-42962422587501/AnsiballZ_stat.py'
Jan 22 22:04:52 compute-0 sudo[144467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:52 compute-0 python3.9[144469]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:52 compute-0 sudo[144467]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:53 compute-0 sudo[144592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxevdnhbxjfbncsedjdypnowbzpakayk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119492.020515-1646-42962422587501/AnsiballZ_copy.py'
Jan 22 22:04:53 compute-0 sudo[144592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:53 compute-0 python3.9[144594]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119492.020515-1646-42962422587501/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:53 compute-0 sudo[144592]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:53 compute-0 sudo[144744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-javkqfyuretahkajgodvzybgmsqjkdpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119493.4766402-1646-239492928644901/AnsiballZ_stat.py'
Jan 22 22:04:53 compute-0 sudo[144744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:54 compute-0 python3.9[144746]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:54 compute-0 sudo[144744]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:54 compute-0 sudo[144867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqgcivhebgumznddgntpyqwhxbbhexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119493.4766402-1646-239492928644901/AnsiballZ_copy.py'
Jan 22 22:04:54 compute-0 sudo[144867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:54 compute-0 python3.9[144869]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119493.4766402-1646-239492928644901/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:54 compute-0 sudo[144867]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:55 compute-0 sudo[145019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utdokwsfnqsfbmlfndpjwukmhowsfhes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119494.8571928-1646-157587279034397/AnsiballZ_stat.py'
Jan 22 22:04:55 compute-0 sudo[145019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:55 compute-0 python3.9[145021]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:04:55 compute-0 sudo[145019]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:55 compute-0 sudo[145144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzlxuppikegcahygwdhaybbivcuwrbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119494.8571928-1646-157587279034397/AnsiballZ_copy.py'
Jan 22 22:04:55 compute-0 sudo[145144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:56 compute-0 python3.9[145146]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119494.8571928-1646-157587279034397/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:56 compute-0 sudo[145144]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:56 compute-0 sudo[145296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxeovooaovueesebtvbypemfuyslitrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119496.3463724-1985-36275824023651/AnsiballZ_command.py'
Jan 22 22:04:56 compute-0 sudo[145296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:56 compute-0 python3.9[145298]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 22 22:04:56 compute-0 sudo[145296]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:57 compute-0 sudo[145449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gngqxtrvmfrvvpindadkzienzmxzjesv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119497.3635054-2012-75623446897756/AnsiballZ_file.py'
Jan 22 22:04:57 compute-0 sudo[145449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:57 compute-0 python3.9[145451]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:57 compute-0 sudo[145449]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:58 compute-0 sudo[145601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxqozzzeydfqpqcwqnuexagydqzhwxao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119498.1391149-2012-54618952718089/AnsiballZ_file.py'
Jan 22 22:04:58 compute-0 sudo[145601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:58 compute-0 python3.9[145603]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:58 compute-0 sudo[145601]: pam_unix(sudo:session): session closed for user root
Jan 22 22:04:59 compute-0 sudo[145753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucyztpdmphcgvlnunsoelwafzitgxutv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119498.9160435-2012-69804338176740/AnsiballZ_file.py'
Jan 22 22:04:59 compute-0 sudo[145753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:04:59 compute-0 python3.9[145755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:04:59 compute-0 sudo[145753]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:00 compute-0 sudo[145905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbiirymoalypnkyejfohqthwikosdmmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119499.6804655-2012-8390412414212/AnsiballZ_file.py'
Jan 22 22:05:00 compute-0 sudo[145905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:00 compute-0 python3.9[145907]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:00 compute-0 sudo[145905]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:00 compute-0 sudo[146057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pquosuswvgxpprqqnptcgsiwbsmepjxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119500.4101527-2012-198129845097225/AnsiballZ_file.py'
Jan 22 22:05:00 compute-0 sudo[146057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:00 compute-0 python3.9[146059]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:00 compute-0 sudo[146057]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:01 compute-0 sudo[146209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhqwrzgudytiznmopwwjgelkumvuvsvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119501.1524699-2012-202430829304142/AnsiballZ_file.py'
Jan 22 22:05:01 compute-0 sudo[146209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:01 compute-0 python3.9[146211]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:01 compute-0 sudo[146209]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:02 compute-0 sudo[146361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oevpaudkbnolueuubkpdilumngibdybg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119501.8817005-2012-137667398566527/AnsiballZ_file.py'
Jan 22 22:05:02 compute-0 sudo[146361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:02 compute-0 python3.9[146363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:02 compute-0 sudo[146361]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:03 compute-0 sudo[146513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlarcetvxmrojmjndlvetkpqhjsdvrai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119502.8543005-2012-59014118217572/AnsiballZ_file.py'
Jan 22 22:05:03 compute-0 sudo[146513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:03 compute-0 python3.9[146515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:03 compute-0 sudo[146513]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:03 compute-0 sudo[146665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uozsvxhyknbbfujdxwickfxljzgibxet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119503.653696-2012-31225540649485/AnsiballZ_file.py'
Jan 22 22:05:03 compute-0 sudo[146665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:04 compute-0 python3.9[146667]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:04 compute-0 sudo[146665]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:04 compute-0 sudo[146817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzydvobomfnoyrqvdqmsewrtiwdgavui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119504.4035306-2012-77154954237422/AnsiballZ_file.py'
Jan 22 22:05:04 compute-0 sudo[146817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:04 compute-0 python3.9[146819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:04 compute-0 sudo[146817]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:05 compute-0 sudo[146969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-depumdfyzgmbwnuyrdyrowkoathvuypc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119505.1534877-2012-191595062808012/AnsiballZ_file.py'
Jan 22 22:05:05 compute-0 sudo[146969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:05 compute-0 python3.9[146971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:05 compute-0 sudo[146969]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:06 compute-0 sudo[147121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmcwjftheqjzlwpyuiixrhcoumnnvumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119506.0128253-2012-100884415186152/AnsiballZ_file.py'
Jan 22 22:05:06 compute-0 sudo[147121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:06 compute-0 python3.9[147123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:06 compute-0 sudo[147121]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:07 compute-0 sudo[147273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxbhtzdnolwljvghrbldlxvhokppdekj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119506.8642743-2012-915980624839/AnsiballZ_file.py'
Jan 22 22:05:07 compute-0 sudo[147273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:07 compute-0 python3.9[147275]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:07 compute-0 sudo[147273]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:07 compute-0 sudo[147425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbwwwtirxgpycdcdjiqgxbigcamegned ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119507.534195-2012-259631830398219/AnsiballZ_file.py'
Jan 22 22:05:07 compute-0 sudo[147425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:08 compute-0 python3.9[147427]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:08 compute-0 sudo[147425]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:09 compute-0 sudo[147590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqzdertqgvjakiuvdczxyhhxiswoucm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119508.6770315-2309-246128771782508/AnsiballZ_stat.py'
Jan 22 22:05:09 compute-0 sudo[147590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:09 compute-0 podman[147551]: 2026-01-22 22:05:09.097001042 +0000 UTC m=+0.131565496 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:05:09 compute-0 python3.9[147599]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:09 compute-0 sudo[147590]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:09 compute-0 sudo[147727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opzupkyrmddcfmmxcsvuohghvqmysheg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119508.6770315-2309-246128771782508/AnsiballZ_copy.py'
Jan 22 22:05:09 compute-0 sudo[147727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:09 compute-0 python3.9[147729]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119508.6770315-2309-246128771782508/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:09 compute-0 sudo[147727]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:10 compute-0 sudo[147879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdngxnjkxgfjzwybgkohixfskdwkrkxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119510.1602578-2309-272875333899054/AnsiballZ_stat.py'
Jan 22 22:05:10 compute-0 sudo[147879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:10 compute-0 python3.9[147881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:10 compute-0 sudo[147879]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:11 compute-0 sudo[148002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogiavvujfozsjooeyggvkoakbnsxltjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119510.1602578-2309-272875333899054/AnsiballZ_copy.py'
Jan 22 22:05:11 compute-0 sudo[148002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:11 compute-0 python3.9[148004]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119510.1602578-2309-272875333899054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:11 compute-0 sudo[148002]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:11 compute-0 sudo[148154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zythhmvbamogeahdjlxtylwavqjfhjyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119511.5493455-2309-194020910594493/AnsiballZ_stat.py'
Jan 22 22:05:11 compute-0 sudo[148154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:12 compute-0 python3.9[148156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:12 compute-0 sudo[148154]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:05:12.412 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:05:12.413 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:05:12.413 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:05:12 compute-0 sudo[148277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aemdrpxoaoywmdbevrpijrtolpzexibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119511.5493455-2309-194020910594493/AnsiballZ_copy.py'
Jan 22 22:05:12 compute-0 sudo[148277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:12 compute-0 python3.9[148279]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119511.5493455-2309-194020910594493/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:12 compute-0 sudo[148277]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:13 compute-0 sudo[148429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eibbhipathfrakmhamioebtmngwxmyvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119512.8701313-2309-255483539399609/AnsiballZ_stat.py'
Jan 22 22:05:13 compute-0 sudo[148429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:13 compute-0 python3.9[148431]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:13 compute-0 sudo[148429]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:13 compute-0 sudo[148552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnonvommbzfibudjshaozowyhtndyxdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119512.8701313-2309-255483539399609/AnsiballZ_copy.py'
Jan 22 22:05:13 compute-0 sudo[148552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:13 compute-0 python3.9[148554]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119512.8701313-2309-255483539399609/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:13 compute-0 sudo[148552]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:14 compute-0 sudo[148704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myswngmbbtltsdjsiufcmikhobytkjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119514.1222987-2309-104054053489819/AnsiballZ_stat.py'
Jan 22 22:05:14 compute-0 sudo[148704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:14 compute-0 python3.9[148706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:14 compute-0 sudo[148704]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:15 compute-0 sudo[148827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajsumtwfukdlsrjjlkzypkzfspnpnjla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119514.1222987-2309-104054053489819/AnsiballZ_copy.py'
Jan 22 22:05:15 compute-0 sudo[148827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:15 compute-0 python3.9[148829]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119514.1222987-2309-104054053489819/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:15 compute-0 sudo[148827]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:15 compute-0 sudo[148988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozkszapbjdjtlkrfixbksyooyqdlubyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119515.5304966-2309-5969742811107/AnsiballZ_stat.py'
Jan 22 22:05:15 compute-0 sudo[148988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:15 compute-0 podman[148953]: 2026-01-22 22:05:15.888778118 +0000 UTC m=+0.072703397 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 22:05:16 compute-0 python3.9[148996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:16 compute-0 sudo[148988]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:16 compute-0 sudo[149121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vygepwbxcwhefipdftipfpqgltztmwaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119515.5304966-2309-5969742811107/AnsiballZ_copy.py'
Jan 22 22:05:16 compute-0 sudo[149121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:16 compute-0 python3.9[149123]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119515.5304966-2309-5969742811107/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:16 compute-0 sudo[149121]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:17 compute-0 sudo[149273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lndaoymvbqliptmmenvupwcapryvghvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119516.8651843-2309-124162895647800/AnsiballZ_stat.py'
Jan 22 22:05:17 compute-0 sudo[149273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:17 compute-0 python3.9[149275]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:17 compute-0 sudo[149273]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:17 compute-0 sudo[149396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkdanjuaakfczbdabmeqthjzwcoxuyov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119516.8651843-2309-124162895647800/AnsiballZ_copy.py'
Jan 22 22:05:17 compute-0 sudo[149396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:17 compute-0 python3.9[149398]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119516.8651843-2309-124162895647800/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:17 compute-0 sudo[149396]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:18 compute-0 sudo[149548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkriqtefdnargpfhhnrursuvrwkkiqrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119518.127548-2309-192853779353829/AnsiballZ_stat.py'
Jan 22 22:05:18 compute-0 sudo[149548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:18 compute-0 python3.9[149550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:18 compute-0 sudo[149548]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:19 compute-0 sudo[149671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlhnuhxcncbjrzpyfwlwylviintwuxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119518.127548-2309-192853779353829/AnsiballZ_copy.py'
Jan 22 22:05:19 compute-0 sudo[149671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:19 compute-0 python3.9[149673]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119518.127548-2309-192853779353829/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:19 compute-0 sudo[149671]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:19 compute-0 sudo[149823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrhrxoranctpiymuohwftojmeekvlpfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119519.4980013-2309-25534372104668/AnsiballZ_stat.py'
Jan 22 22:05:19 compute-0 sudo[149823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:20 compute-0 python3.9[149825]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:20 compute-0 sudo[149823]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:20 compute-0 sudo[149946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhsqfnajguvuqzkfkciqroixvulnzsdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119519.4980013-2309-25534372104668/AnsiballZ_copy.py'
Jan 22 22:05:20 compute-0 sudo[149946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:20 compute-0 python3.9[149948]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119519.4980013-2309-25534372104668/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:20 compute-0 sudo[149946]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:21 compute-0 sudo[150098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjtbbucugsxecptfjzwaevwbpdaulfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119520.8610532-2309-86928080909030/AnsiballZ_stat.py'
Jan 22 22:05:21 compute-0 sudo[150098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:21 compute-0 python3.9[150100]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:21 compute-0 sudo[150098]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:21 compute-0 sudo[150221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkzddozmouipkthgdafvaiknslaheilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119520.8610532-2309-86928080909030/AnsiballZ_copy.py'
Jan 22 22:05:21 compute-0 sudo[150221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:22 compute-0 python3.9[150223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119520.8610532-2309-86928080909030/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:22 compute-0 sudo[150221]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:22 compute-0 sudo[150373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqymnjqfozinjocjguhvvyeorpedxgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119522.24513-2309-62458650353785/AnsiballZ_stat.py'
Jan 22 22:05:22 compute-0 sudo[150373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:22 compute-0 python3.9[150375]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:22 compute-0 sudo[150373]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:23 compute-0 sudo[150496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtefxurjkikjcnvlzlwthbcgasnrrqfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119522.24513-2309-62458650353785/AnsiballZ_copy.py'
Jan 22 22:05:23 compute-0 sudo[150496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:23 compute-0 python3.9[150498]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119522.24513-2309-62458650353785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:23 compute-0 sudo[150496]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:23 compute-0 sudo[150648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uitvzmycqqfoixmuzrgkgcxmgzfwlgoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119523.6188715-2309-119854712034042/AnsiballZ_stat.py'
Jan 22 22:05:23 compute-0 sudo[150648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:24 compute-0 python3.9[150650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:24 compute-0 sudo[150648]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:24 compute-0 sudo[150771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myigqlmyjenaglzaksxqhedqrzlhrwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119523.6188715-2309-119854712034042/AnsiballZ_copy.py'
Jan 22 22:05:24 compute-0 sudo[150771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:24 compute-0 python3.9[150773]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119523.6188715-2309-119854712034042/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:24 compute-0 sudo[150771]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:25 compute-0 sudo[150923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqpcwzyhwcslfvmmsbubmbdpnwmtlbhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119524.9381223-2309-49896472262913/AnsiballZ_stat.py'
Jan 22 22:05:25 compute-0 sudo[150923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:25 compute-0 python3.9[150925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:25 compute-0 sudo[150923]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:25 compute-0 sudo[151046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdvnsjztdvkxpwseugxceexdrjeclxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119524.9381223-2309-49896472262913/AnsiballZ_copy.py'
Jan 22 22:05:25 compute-0 sudo[151046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:26 compute-0 python3.9[151048]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119524.9381223-2309-49896472262913/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:26 compute-0 sudo[151046]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:26 compute-0 sudo[151198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crshvisprouklwxubyvsfzirjptlqogj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119526.314848-2309-127271177925655/AnsiballZ_stat.py'
Jan 22 22:05:26 compute-0 sudo[151198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:26 compute-0 python3.9[151200]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:26 compute-0 sudo[151198]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:27 compute-0 sudo[151321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-novvyufnmgjtqzjuhrqsjcyhqqatrzmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119526.314848-2309-127271177925655/AnsiballZ_copy.py'
Jan 22 22:05:27 compute-0 sudo[151321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:27 compute-0 python3.9[151323]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119526.314848-2309-127271177925655/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:27 compute-0 sudo[151321]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:28 compute-0 python3.9[151473]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:05:29 compute-0 sudo[151626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcttmpnlmpovayvxmueeysluppousevt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119528.67766-2927-160919046940443/AnsiballZ_seboolean.py'
Jan 22 22:05:29 compute-0 sudo[151626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:29 compute-0 python3.9[151628]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 22 22:05:30 compute-0 sudo[151626]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:31 compute-0 sudo[151782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgneeloptxgebdvjrmcupxbljrpyuxqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119530.9813414-2951-106534725791882/AnsiballZ_copy.py'
Jan 22 22:05:31 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 22 22:05:31 compute-0 sudo[151782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:31 compute-0 python3.9[151784]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:31 compute-0 sudo[151782]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:32 compute-0 sudo[151934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owgsjzclbmiidwpyxglmfreitmdevrim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119531.7112164-2951-74937729087483/AnsiballZ_copy.py'
Jan 22 22:05:32 compute-0 sudo[151934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:32 compute-0 python3.9[151936]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:32 compute-0 sudo[151934]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:32 compute-0 sudo[152086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iktlxfmwlvjqdpwgqjnvqnwjhtuqrpuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119532.4952219-2951-175635167924898/AnsiballZ_copy.py'
Jan 22 22:05:32 compute-0 sudo[152086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:32 compute-0 python3.9[152088]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:33 compute-0 sudo[152086]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:33 compute-0 sudo[152238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnhrsvncmglwetjonhbcaxsdlmwsdriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119533.1985357-2951-2499535317076/AnsiballZ_copy.py'
Jan 22 22:05:33 compute-0 sudo[152238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:33 compute-0 python3.9[152240]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:33 compute-0 sudo[152238]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:34 compute-0 sudo[152390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjdkpzkafvfspdmaypunazposlqpppz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119534.0285535-2951-210632768890880/AnsiballZ_copy.py'
Jan 22 22:05:34 compute-0 sudo[152390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:34 compute-0 python3.9[152392]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:34 compute-0 sudo[152390]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:35 compute-0 sudo[152542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnijdyaxysczgouuxfgvtuxtslobhjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119534.940186-3059-74492157456549/AnsiballZ_copy.py'
Jan 22 22:05:35 compute-0 sudo[152542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:35 compute-0 python3.9[152544]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:35 compute-0 sudo[152542]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:36 compute-0 sudo[152694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fggdczlsfiivxncddndtoievjyeltsod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119535.7143898-3059-115959506439103/AnsiballZ_copy.py'
Jan 22 22:05:36 compute-0 sudo[152694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:36 compute-0 python3.9[152696]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:36 compute-0 sudo[152694]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:36 compute-0 sudo[152846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwafuoegmdbllnlxndptckcylpprtgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119536.385366-3059-34656305544388/AnsiballZ_copy.py'
Jan 22 22:05:36 compute-0 sudo[152846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:36 compute-0 python3.9[152848]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:36 compute-0 sudo[152846]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:37 compute-0 sudo[152998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moyscrilhnsmkmhbnwuqmxbwbddnutlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119537.0979073-3059-145697918002237/AnsiballZ_copy.py'
Jan 22 22:05:37 compute-0 sudo[152998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:37 compute-0 python3.9[153000]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:37 compute-0 sudo[152998]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:38 compute-0 sudo[153150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjpzrwrolgqjggdhxhvfefiuztwnggmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119537.8761587-3059-210663445610753/AnsiballZ_copy.py'
Jan 22 22:05:38 compute-0 sudo[153150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:38 compute-0 python3.9[153152]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:38 compute-0 sudo[153150]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:39 compute-0 sudo[153317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apkwaancrwuvcfctvxukwsrvxfeiwocq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119539.1672027-3167-87627907108058/AnsiballZ_systemd.py'
Jan 22 22:05:39 compute-0 sudo[153317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:39 compute-0 podman[153276]: 2026-01-22 22:05:39.597850117 +0000 UTC m=+0.108928791 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:05:39 compute-0 python3.9[153325]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:05:39 compute-0 systemd[1]: Reloading.
Jan 22 22:05:39 compute-0 systemd-sysv-generator[153365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:05:39 compute-0 systemd-rc-local-generator[153362]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:05:40 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 22 22:05:40 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 22 22:05:40 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 22 22:05:40 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 22 22:05:40 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 22 22:05:40 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 22 22:05:40 compute-0 sudo[153317]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:40 compute-0 sudo[153525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyfplwumvqkfrculewfmqhvoqrjlvmbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119540.4631844-3167-162120591059700/AnsiballZ_systemd.py'
Jan 22 22:05:40 compute-0 sudo[153525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:41 compute-0 python3.9[153527]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:05:41 compute-0 systemd[1]: Reloading.
Jan 22 22:05:41 compute-0 systemd-rc-local-generator[153556]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:05:41 compute-0 systemd-sysv-generator[153559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:05:41 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 22 22:05:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 22 22:05:41 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 22 22:05:41 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 22 22:05:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 22 22:05:41 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 22 22:05:41 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 22:05:41 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 22 22:05:41 compute-0 sudo[153525]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:42 compute-0 sudo[153742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgmhhszwlvwkzcvgmvhkgpvnuntoxhfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119541.7397194-3167-23780967742246/AnsiballZ_systemd.py'
Jan 22 22:05:42 compute-0 sudo[153742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:42 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 22 22:05:42 compute-0 python3.9[153744]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:05:42 compute-0 systemd[1]: Reloading.
Jan 22 22:05:42 compute-0 systemd-sysv-generator[153776]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:05:42 compute-0 systemd-rc-local-generator[153773]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:05:42 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 22 22:05:42 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 22 22:05:42 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 22 22:05:42 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 22 22:05:42 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 22 22:05:42 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 22 22:05:42 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 22 22:05:42 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 22 22:05:42 compute-0 sudo[153742]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:42 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 22 22:05:43 compute-0 sudo[153963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-armavztlttzmqmtpwijqhhzqiegpdbnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119542.8817172-3167-56872175008905/AnsiballZ_systemd.py'
Jan 22 22:05:43 compute-0 sudo[153963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:43 compute-0 python3.9[153965]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:05:43 compute-0 systemd[1]: Reloading.
Jan 22 22:05:43 compute-0 systemd-rc-local-generator[153991]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:05:43 compute-0 systemd-sysv-generator[153994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:05:43 compute-0 setroubleshoot[153745]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 9bf71cb8-832e-432a-bf80-cbd923929a0a
Jan 22 22:05:43 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 22 22:05:43 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 22 22:05:43 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 22 22:05:43 compute-0 setroubleshoot[153745]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 22 22:05:43 compute-0 setroubleshoot[153745]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 9bf71cb8-832e-432a-bf80-cbd923929a0a
Jan 22 22:05:43 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:05:43 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 22 22:05:43 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 22 22:05:43 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 22 22:05:43 compute-0 setroubleshoot[153745]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 22 22:05:43 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 22 22:05:43 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 22 22:05:43 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 22 22:05:43 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 22 22:05:43 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 22:05:43 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 22 22:05:43 compute-0 sudo[153963]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:44 compute-0 sudo[154180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mebubnuurscxncezftfbwgnetmqmifdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119543.9238038-3167-57902120945446/AnsiballZ_systemd.py'
Jan 22 22:05:44 compute-0 sudo[154180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:44 compute-0 python3.9[154182]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:05:44 compute-0 systemd[1]: Reloading.
Jan 22 22:05:44 compute-0 systemd-rc-local-generator[154211]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:05:44 compute-0 systemd-sysv-generator[154215]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:05:44 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 22 22:05:44 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 22 22:05:44 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 22 22:05:44 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 22 22:05:44 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 22 22:05:44 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 22 22:05:44 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 22 22:05:44 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 22 22:05:44 compute-0 sudo[154180]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:45 compute-0 sudo[154392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmkhotzfocogoernbtfcfidwzgtvrrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119545.2386186-3278-5784895750490/AnsiballZ_file.py'
Jan 22 22:05:45 compute-0 sudo[154392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:45 compute-0 python3.9[154394]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:45 compute-0 sudo[154392]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:46 compute-0 podman[154429]: 2026-01-22 22:05:46.124564828 +0000 UTC m=+0.059049885 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:05:46 compute-0 sudo[154561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luhrnbjjrkpfonaxerpdrfirlggypmet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119546.067782-3302-262657130816598/AnsiballZ_find.py'
Jan 22 22:05:46 compute-0 sudo[154561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:46 compute-0 python3.9[154563]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 22:05:46 compute-0 sudo[154561]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:47 compute-0 sudo[154715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvemuljttriklyfyamhvgxvftvamhil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119547.1830163-3344-60223136177738/AnsiballZ_stat.py'
Jan 22 22:05:47 compute-0 sudo[154715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:47 compute-0 python3.9[154717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:47 compute-0 sudo[154715]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:47 compute-0 sshd-session[154634]: Connection closed by authenticating user root 134.209.61.246 port 38370 [preauth]
Jan 22 22:05:47 compute-0 sudo[154840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtyxkndldavgynghsztvesxfwvttsinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119547.1830163-3344-60223136177738/AnsiballZ_copy.py'
Jan 22 22:05:47 compute-0 sudo[154840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:48 compute-0 python3.9[154842]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119547.1830163-3344-60223136177738/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:48 compute-0 sudo[154840]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:48 compute-0 sshd-session[154764]: Connection closed by authenticating user root 134.209.61.246 port 57620 [preauth]
Jan 22 22:05:48 compute-0 sshd-session[154867]: Connection closed by authenticating user root 134.209.61.246 port 57632 [preauth]
Jan 22 22:05:48 compute-0 sudo[154996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnkmxkbzkcjhlnzcbtxjmuhpikfpxvds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119548.5866737-3392-48793706726482/AnsiballZ_file.py'
Jan 22 22:05:48 compute-0 sudo[154996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:49 compute-0 python3.9[154998]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:49 compute-0 sudo[154996]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:49 compute-0 sshd-session[154944]: Connection closed by authenticating user root 134.209.61.246 port 57646 [preauth]
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57654 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57662 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57674 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57688 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:49 compute-0 sudo[155148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaddmtzzgwktriitrslfkjwkzsiltsdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119549.2873669-3416-140430816149226/AnsiballZ_stat.py'
Jan 22 22:05:49 compute-0 sudo[155148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57694 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:49 compute-0 python3.9[155150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:49 compute-0 sudo[155148]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:49 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57704 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57714 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sudo[155226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjbhimeulbdcsfhjxpmeaybpquawrbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119549.2873669-3416-140430816149226/AnsiballZ_file.py'
Jan 22 22:05:50 compute-0 sudo[155226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57730 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57732 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 python3.9[155228]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:50 compute-0 sudo[155226]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57736 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57752 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57762 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57776 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:50 compute-0 sudo[155378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdnkmaxxvkwrkbqivnbitzstqbsgnvbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119550.592136-3452-269731737512836/AnsiballZ_stat.py'
Jan 22 22:05:50 compute-0 sudo[155378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:51 compute-0 sshd[129108]: drop connection #0 from [134.209.61.246]:57778 on [38.102.83.50]:22 penalty: failed authentication
Jan 22 22:05:51 compute-0 python3.9[155380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:51 compute-0 sudo[155378]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:51 compute-0 sudo[155456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrarixoltvvpapotfmsskrgvgraemmcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119550.592136-3452-269731737512836/AnsiballZ_file.py'
Jan 22 22:05:51 compute-0 sudo[155456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:51 compute-0 python3.9[155458]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.g9wgzdbb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:51 compute-0 sudo[155456]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:52 compute-0 sudo[155608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysndvbcgcgfucqqcpayuglwostabqscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119551.8363183-3488-43013874428583/AnsiballZ_stat.py'
Jan 22 22:05:52 compute-0 sudo[155608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:52 compute-0 python3.9[155610]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:52 compute-0 sudo[155608]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:52 compute-0 sudo[155686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddvijmwtmndpdajntobcdysencqdhuin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119551.8363183-3488-43013874428583/AnsiballZ_file.py'
Jan 22 22:05:52 compute-0 sudo[155686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:52 compute-0 python3.9[155688]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:52 compute-0 sudo[155686]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:53 compute-0 sudo[155838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wijwparkfxpzialubtkjclsfhjiykuam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119553.117333-3527-15239893131360/AnsiballZ_command.py'
Jan 22 22:05:53 compute-0 sudo[155838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:53 compute-0 python3.9[155840]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:05:53 compute-0 sudo[155838]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:53 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 22 22:05:53 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.022s CPU time.
Jan 22 22:05:53 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 22 22:05:54 compute-0 sudo[155991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqcwhzofzardsooapwlalktrlbnbfilz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119553.854603-3551-67657175668942/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 22:05:54 compute-0 sudo[155991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:54 compute-0 python3[155993]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 22:05:54 compute-0 sudo[155991]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:55 compute-0 sudo[156143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyfdjvktotkskybroskhamypwudnnmxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119554.715071-3575-193710691483130/AnsiballZ_stat.py'
Jan 22 22:05:55 compute-0 sudo[156143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:55 compute-0 python3.9[156145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:55 compute-0 sudo[156143]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:55 compute-0 sudo[156221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjupvhlaahhtytvjxuxzgmrxnguagez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119554.715071-3575-193710691483130/AnsiballZ_file.py'
Jan 22 22:05:55 compute-0 sudo[156221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:55 compute-0 python3.9[156223]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:55 compute-0 sudo[156221]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:56 compute-0 sudo[156373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isxvepdfvpklrkmxsuuizedgwufxrqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119555.8723168-3611-49189971031075/AnsiballZ_stat.py'
Jan 22 22:05:56 compute-0 sudo[156373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:56 compute-0 python3.9[156375]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:56 compute-0 sudo[156373]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:56 compute-0 sudo[156498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltxoraevwawtlrfwokyrvhmcuddgvudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119555.8723168-3611-49189971031075/AnsiballZ_copy.py'
Jan 22 22:05:56 compute-0 sudo[156498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:56 compute-0 python3.9[156500]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119555.8723168-3611-49189971031075/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:57 compute-0 sudo[156498]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:57 compute-0 sudo[156650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvkkelbpkwgtijsirmafbigcwfugceus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119557.2784162-3656-214755751080565/AnsiballZ_stat.py'
Jan 22 22:05:57 compute-0 sudo[156650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:57 compute-0 python3.9[156652]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:57 compute-0 sudo[156650]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:58 compute-0 sudo[156728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvclgeblgaakywwddlddvnuwhulqskvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119557.2784162-3656-214755751080565/AnsiballZ_file.py'
Jan 22 22:05:58 compute-0 sudo[156728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:58 compute-0 python3.9[156730]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:58 compute-0 sudo[156728]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:58 compute-0 sudo[156880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efyyyrruhifgjmtlgukvdyesagjnuvil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119558.4400666-3692-110362569498404/AnsiballZ_stat.py'
Jan 22 22:05:58 compute-0 sudo[156880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:58 compute-0 python3.9[156882]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:05:59 compute-0 sudo[156880]: pam_unix(sudo:session): session closed for user root
Jan 22 22:05:59 compute-0 sudo[156958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nugqybjmjlqkudvtunvjztcmmtvfwrox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119558.4400666-3692-110362569498404/AnsiballZ_file.py'
Jan 22 22:05:59 compute-0 sudo[156958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:05:59 compute-0 python3.9[156960]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:05:59 compute-0 sudo[156958]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:00 compute-0 sudo[157110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrtbrwafyjcidlweowkaenyrhrnbnnbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119560.0432336-3728-195528328274529/AnsiballZ_stat.py'
Jan 22 22:06:00 compute-0 sudo[157110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:00 compute-0 python3.9[157112]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:06:00 compute-0 sudo[157110]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:01 compute-0 sudo[157235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmonbzofayccmplaejmrccfgzvttgwix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119560.0432336-3728-195528328274529/AnsiballZ_copy.py'
Jan 22 22:06:01 compute-0 sudo[157235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:01 compute-0 python3.9[157237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119560.0432336-3728-195528328274529/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:01 compute-0 sudo[157235]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:01 compute-0 sudo[157387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpjzarjgbglbgvbdheavtrezzpitcotf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119561.4879508-3773-142323565897049/AnsiballZ_file.py'
Jan 22 22:06:01 compute-0 sudo[157387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:01 compute-0 python3.9[157389]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:02 compute-0 sudo[157387]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:02 compute-0 sudo[157539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpwwqcovjppuddvqxpshnyvwpjbqlgcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119562.174463-3797-156612260598226/AnsiballZ_command.py'
Jan 22 22:06:02 compute-0 sudo[157539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:02 compute-0 python3.9[157541]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:06:02 compute-0 sudo[157539]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:03 compute-0 sudo[157694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epxinftnyzorkmkpsqedsxhkjuxvzdrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119562.9531457-3821-125900172156393/AnsiballZ_blockinfile.py'
Jan 22 22:06:03 compute-0 sudo[157694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:03 compute-0 python3.9[157696]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:03 compute-0 sudo[157694]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:04 compute-0 sudo[157846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzyzqizhoiuxyxhkqzodkezioumduztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119563.9403653-3848-159206553593743/AnsiballZ_command.py'
Jan 22 22:06:04 compute-0 sudo[157846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:04 compute-0 python3.9[157848]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:06:04 compute-0 sudo[157846]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:05 compute-0 sudo[157999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqpunkvvlmwnmjfrfjswkpcrwzcgqyfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119564.7620344-3872-187291291617674/AnsiballZ_stat.py'
Jan 22 22:06:05 compute-0 sudo[157999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:05 compute-0 python3.9[158001]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:06:05 compute-0 sudo[157999]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:05 compute-0 sudo[158153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfbvwitfktupnyszxznvmwqfcmubjrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119565.6039484-3896-54853345505895/AnsiballZ_command.py'
Jan 22 22:06:05 compute-0 sudo[158153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:06 compute-0 python3.9[158155]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:06:06 compute-0 sudo[158153]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:06 compute-0 sudo[158308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxrazzsxsqsgwydhxjpwqwfoekaatmhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119566.340373-3920-257281812002948/AnsiballZ_file.py'
Jan 22 22:06:06 compute-0 sudo[158308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:06 compute-0 python3.9[158310]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:06 compute-0 sudo[158308]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:07 compute-0 sudo[158460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjvtgchrlvxkkyqwuwhdsoykhnataye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119567.0586603-3944-146768924518300/AnsiballZ_stat.py'
Jan 22 22:06:07 compute-0 sudo[158460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:07 compute-0 python3.9[158462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:06:07 compute-0 sudo[158460]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:07 compute-0 sudo[158583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xujxhlzlymwxczbthhnerufybgsmpznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119567.0586603-3944-146768924518300/AnsiballZ_copy.py'
Jan 22 22:06:07 compute-0 sudo[158583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:08 compute-0 python3.9[158585]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119567.0586603-3944-146768924518300/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:08 compute-0 sudo[158583]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:08 compute-0 sudo[158735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqoxrpmfkdwouwyzeldrrqnnvtyaoktw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119568.4906561-3989-135494951236465/AnsiballZ_stat.py'
Jan 22 22:06:08 compute-0 sudo[158735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:09 compute-0 python3.9[158737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:06:09 compute-0 sudo[158735]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:09 compute-0 sudo[158858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwebfnagochnwdvqkhtjznadudylysvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119568.4906561-3989-135494951236465/AnsiballZ_copy.py'
Jan 22 22:06:09 compute-0 sudo[158858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:09 compute-0 python3.9[158860]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119568.4906561-3989-135494951236465/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:09 compute-0 sudo[158858]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:10 compute-0 podman[158908]: 2026-01-22 22:06:10.213111739 +0000 UTC m=+0.127828342 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:06:10 compute-0 sudo[159037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byntpymbjjtxaugkuirhxidvsthsoila ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119570.0498993-4034-100649718877177/AnsiballZ_stat.py'
Jan 22 22:06:10 compute-0 sudo[159037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:10 compute-0 python3.9[159039]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:06:10 compute-0 sudo[159037]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:11 compute-0 sudo[159160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdguefpsehjfoseqtcfyitcurldsgdps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119570.0498993-4034-100649718877177/AnsiballZ_copy.py'
Jan 22 22:06:11 compute-0 sudo[159160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:11 compute-0 python3.9[159162]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119570.0498993-4034-100649718877177/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:11 compute-0 sudo[159160]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:11 compute-0 sudo[159312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdzikfgqoijowijjdjmkfkzlamovlnaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119571.507768-4079-220247480513334/AnsiballZ_systemd.py'
Jan 22 22:06:11 compute-0 sudo[159312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:12 compute-0 python3.9[159314]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:06:12 compute-0 systemd[1]: Reloading.
Jan 22 22:06:12 compute-0 systemd-rc-local-generator[159341]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:06:12 compute-0 systemd-sysv-generator[159345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:06:12.413 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:06:12.414 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:06:12.414 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:06:12 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 22 22:06:12 compute-0 sudo[159312]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:13 compute-0 sudo[159503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kukjkpdsnxnfwnmsgvrrcpwdsvnpwcgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119572.7283967-4103-200689756240259/AnsiballZ_systemd.py'
Jan 22 22:06:13 compute-0 sudo[159503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:13 compute-0 python3.9[159505]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 22:06:13 compute-0 systemd[1]: Reloading.
Jan 22 22:06:13 compute-0 systemd-rc-local-generator[159533]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:06:13 compute-0 systemd-sysv-generator[159536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:06:13 compute-0 systemd[1]: Reloading.
Jan 22 22:06:13 compute-0 systemd-rc-local-generator[159569]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:06:13 compute-0 systemd-sysv-generator[159572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:06:14 compute-0 sudo[159503]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:14 compute-0 sshd-session[104763]: Connection closed by 192.168.122.30 port 52542
Jan 22 22:06:14 compute-0 sshd-session[104760]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:06:14 compute-0 systemd-logind[801]: Session 22 logged out. Waiting for processes to exit.
Jan 22 22:06:14 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 22 22:06:14 compute-0 systemd[1]: session-22.scope: Consumed 3min 47.142s CPU time.
Jan 22 22:06:14 compute-0 systemd-logind[801]: Removed session 22.
Jan 22 22:06:17 compute-0 podman[159601]: 2026-01-22 22:06:17.160063146 +0000 UTC m=+0.093129422 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:06:19 compute-0 sshd-session[159621]: Accepted publickey for zuul from 192.168.122.30 port 53422 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 22:06:19 compute-0 systemd-logind[801]: New session 23 of user zuul.
Jan 22 22:06:19 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 22 22:06:19 compute-0 sshd-session[159621]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 22:06:20 compute-0 python3.9[159774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:06:22 compute-0 python3.9[159928]: ansible-ansible.builtin.service_facts Invoked
Jan 22 22:06:22 compute-0 network[159945]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 22:06:22 compute-0 network[159946]: 'network-scripts' will be removed from distribution in near future.
Jan 22 22:06:22 compute-0 network[159947]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 22:06:28 compute-0 sudo[160216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrkdfbkzakmflytttxadfpkguphzwslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119587.9010715-101-2559903522982/AnsiballZ_setup.py'
Jan 22 22:06:28 compute-0 sudo[160216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:28 compute-0 python3.9[160218]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 22:06:28 compute-0 sudo[160216]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:29 compute-0 sudo[160300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adonfvurxqvdahyuipzqfgnbfgzchpgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119587.9010715-101-2559903522982/AnsiballZ_dnf.py'
Jan 22 22:06:29 compute-0 sudo[160300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:29 compute-0 python3.9[160302]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 22:06:35 compute-0 sudo[160300]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:35 compute-0 sudo[160453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlyfbeygeinwnvlkwdxwawfmepxtupev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119595.5515492-137-93999480797139/AnsiballZ_stat.py'
Jan 22 22:06:35 compute-0 sudo[160453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:36 compute-0 python3.9[160455]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:06:36 compute-0 sudo[160453]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:37 compute-0 sudo[160605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obqsaddqjmoignumbawkqsvhqoyesaul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119596.5244505-167-246898366889633/AnsiballZ_command.py'
Jan 22 22:06:37 compute-0 sudo[160605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:37 compute-0 python3.9[160607]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:06:37 compute-0 sudo[160605]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:38 compute-0 sudo[160758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhdkoubgoyrexjzqgoixtytaklyjdfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119597.6312823-197-149311683023069/AnsiballZ_stat.py'
Jan 22 22:06:38 compute-0 sudo[160758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:38 compute-0 python3.9[160760]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:06:38 compute-0 sudo[160758]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:39 compute-0 sudo[160910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcqxrifesaahepentpbcbaqnqwqribpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119598.5425732-221-59389941020221/AnsiballZ_command.py'
Jan 22 22:06:39 compute-0 sudo[160910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:39 compute-0 python3.9[160912]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:06:39 compute-0 sudo[160910]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:40 compute-0 sudo[161074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zornqybpbcfyknbpdgguprvqwnbrkpxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119600.0853877-245-232597345997479/AnsiballZ_stat.py'
Jan 22 22:06:40 compute-0 sudo[161074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:40 compute-0 podman[161037]: 2026-01-22 22:06:40.51189723 +0000 UTC m=+0.132753044 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 22:06:40 compute-0 python3.9[161080]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:06:40 compute-0 sudo[161074]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:41 compute-0 sudo[161211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvzxpvpsqvhxfwupjhxatwiiqfwngrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119600.0853877-245-232597345997479/AnsiballZ_copy.py'
Jan 22 22:06:41 compute-0 sudo[161211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:41 compute-0 python3.9[161213]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119600.0853877-245-232597345997479/.source.iscsi _original_basename=.2eztxoyy follow=False checksum=b06dfcfb7db2243c2c60fed65c07b0843cb6f9bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:41 compute-0 sudo[161211]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:41 compute-0 sudo[161363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhjshqwlzanzymdovgupsloijqncexv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119601.5628319-290-87273282830918/AnsiballZ_file.py'
Jan 22 22:06:41 compute-0 sudo[161363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:42 compute-0 python3.9[161365]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:42 compute-0 sudo[161363]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:42 compute-0 sudo[161515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gotoowyciupvphjfnipjspidwzehpihh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119602.4059112-314-17869875752322/AnsiballZ_lineinfile.py'
Jan 22 22:06:42 compute-0 sudo[161515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:43 compute-0 python3.9[161517]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:06:43 compute-0 sudo[161515]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:44 compute-0 sudo[161667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idzretjzgndnhsazxnbxlofrleicfodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119603.411255-341-176098906940853/AnsiballZ_systemd_service.py'
Jan 22 22:06:44 compute-0 sudo[161667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:44 compute-0 python3.9[161669]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:06:44 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 22 22:06:44 compute-0 sudo[161667]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:45 compute-0 sudo[161823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrbysxefqjebzelncwgnroidhbfrykrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119605.658176-365-146112662029037/AnsiballZ_systemd_service.py'
Jan 22 22:06:45 compute-0 sudo[161823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:46 compute-0 python3.9[161825]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:06:46 compute-0 systemd[1]: Reloading.
Jan 22 22:06:46 compute-0 systemd-rc-local-generator[161855]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:06:46 compute-0 systemd-sysv-generator[161859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:06:46 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 22:06:46 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 22 22:06:46 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 22 22:06:46 compute-0 systemd[1]: Started Open-iSCSI.
Jan 22 22:06:46 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 22 22:06:46 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 22 22:06:46 compute-0 sudo[161823]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:47 compute-0 podman[161999]: 2026-01-22 22:06:47.689711824 +0000 UTC m=+0.070402154 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 22:06:47 compute-0 python3.9[162036]: ansible-ansible.builtin.service_facts Invoked
Jan 22 22:06:47 compute-0 network[162061]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 22:06:47 compute-0 network[162062]: 'network-scripts' will be removed from distribution in near future.
Jan 22 22:06:47 compute-0 network[162063]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 22:06:54 compute-0 sudo[162332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnwpqyanpfvyytdzmdiwrmrqkcfbysya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119613.6794088-434-189521473603098/AnsiballZ_dnf.py'
Jan 22 22:06:54 compute-0 sudo[162332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:06:54 compute-0 python3.9[162334]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 22:06:57 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 22:06:57 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 22:06:57 compute-0 systemd[1]: Reloading.
Jan 22 22:06:57 compute-0 systemd-rc-local-generator[162381]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:06:57 compute-0 systemd-sysv-generator[162385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:06:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 22:06:58 compute-0 sudo[162332]: pam_unix(sudo:session): session closed for user root
Jan 22 22:06:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 22:06:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 22:06:59 compute-0 systemd[1]: run-rff4f4e1887c04a1186141b8f7894b08f.service: Deactivated successfully.
Jan 22 22:07:00 compute-0 sudo[162648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asvfnzwqwvriatkugnwbsgsfljjoyvjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119619.7486863-461-92413489172847/AnsiballZ_file.py'
Jan 22 22:07:00 compute-0 sudo[162648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:00 compute-0 python3.9[162650]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 22:07:00 compute-0 sudo[162648]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:00 compute-0 sudo[162800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lazukbfkevmyysugjwiljgsvwkezfcla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119620.4854114-485-9732264734499/AnsiballZ_modprobe.py'
Jan 22 22:07:00 compute-0 sudo[162800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:01 compute-0 python3.9[162802]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 22 22:07:01 compute-0 sudo[162800]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:01 compute-0 sudo[162956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfdxlbxqdzgzdufprcrksmhoamejaapj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119621.377363-509-82353085578130/AnsiballZ_stat.py'
Jan 22 22:07:01 compute-0 sudo[162956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:01 compute-0 python3.9[162958]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:07:01 compute-0 sudo[162956]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:02 compute-0 sudo[163079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaswviiphwuhgajyushonwfjmzhzopmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119621.377363-509-82353085578130/AnsiballZ_copy.py'
Jan 22 22:07:02 compute-0 sudo[163079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:02 compute-0 python3.9[163081]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119621.377363-509-82353085578130/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:02 compute-0 sudo[163079]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:03 compute-0 sudo[163231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntjslczfhbzqvioywiirtzjppbykaowk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119622.7879298-557-118461542290809/AnsiballZ_lineinfile.py'
Jan 22 22:07:03 compute-0 sudo[163231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:03 compute-0 python3.9[163233]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:03 compute-0 sudo[163231]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:04 compute-0 sudo[163383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjwajvorjzerdbkecwkxfjkkbckxdfhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119623.5071108-581-209340245317732/AnsiballZ_systemd.py'
Jan 22 22:07:04 compute-0 sudo[163383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:04 compute-0 python3.9[163385]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:07:04 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 22:07:04 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 22 22:07:04 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 22 22:07:04 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 22:07:04 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 22:07:04 compute-0 sudo[163383]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:05 compute-0 sudo[163539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uccinpvesgjfrpdjhkxtzgaepaivgmkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119624.744478-605-237411421452438/AnsiballZ_command.py'
Jan 22 22:07:05 compute-0 sudo[163539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:05 compute-0 python3.9[163541]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:07:05 compute-0 sudo[163539]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:05 compute-0 sudo[163692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kekmvlxdbmncnoegbyvepgazwuenaqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119625.632226-635-261783441599129/AnsiballZ_stat.py'
Jan 22 22:07:05 compute-0 sudo[163692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:06 compute-0 python3.9[163694]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:07:06 compute-0 sudo[163692]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:06 compute-0 sudo[163844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delobndtvapbnkswgahdhwuidvvetffb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119626.3918529-662-267847230200841/AnsiballZ_stat.py'
Jan 22 22:07:06 compute-0 sudo[163844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:06 compute-0 python3.9[163846]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:07:06 compute-0 sudo[163844]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:07 compute-0 sudo[163967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqeptrgnvtezypvvknnltqrcafflvptb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119626.3918529-662-267847230200841/AnsiballZ_copy.py'
Jan 22 22:07:07 compute-0 sudo[163967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:07 compute-0 python3.9[163969]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119626.3918529-662-267847230200841/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:07 compute-0 sudo[163967]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:08 compute-0 sudo[164119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjzcsyhpwlmnsvogmgnkelzifnxbkqan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119627.7585566-707-161100865336501/AnsiballZ_command.py'
Jan 22 22:07:08 compute-0 sudo[164119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:08 compute-0 python3.9[164121]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:07:08 compute-0 sudo[164119]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:09 compute-0 sudo[164272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfeowwekoxvskfdjdeercvjmxsizqqgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119628.6843688-731-126248891919427/AnsiballZ_lineinfile.py'
Jan 22 22:07:09 compute-0 sudo[164272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:09 compute-0 python3.9[164274]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:09 compute-0 sudo[164272]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:10 compute-0 sudo[164424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyjyxxxbicsobwrcisguxozpskjffxcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119629.4683466-755-77434768845714/AnsiballZ_replace.py'
Jan 22 22:07:10 compute-0 sudo[164424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:10 compute-0 python3.9[164426]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:10 compute-0 sudo[164424]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:10 compute-0 sudo[164588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wufkqgdtzfburzaipfepqxfbgehgeyto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119630.4694326-779-218425783306386/AnsiballZ_replace.py'
Jan 22 22:07:10 compute-0 sudo[164588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:10 compute-0 podman[164550]: 2026-01-22 22:07:10.872493433 +0000 UTC m=+0.111216195 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:07:11 compute-0 python3.9[164597]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:11 compute-0 sudo[164588]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:11 compute-0 sudo[164754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byohynzufrqxdarrbawupnfxpcdhighl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119631.2942262-806-177141307402888/AnsiballZ_lineinfile.py'
Jan 22 22:07:11 compute-0 sudo[164754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:11 compute-0 python3.9[164756]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:11 compute-0 sudo[164754]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:12 compute-0 sudo[164906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgnylsahmtseepujccbpzypugshvjwqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119631.9951575-806-114691749695661/AnsiballZ_lineinfile.py'
Jan 22 22:07:12 compute-0 sudo[164906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:07:12.415 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:07:12.415 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:07:12.416 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:07:12 compute-0 python3.9[164908]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:12 compute-0 sudo[164906]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:13 compute-0 sudo[165058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hejdofqclnlazrzrxdapqfwdpadnvgsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119632.7767692-806-256235593752724/AnsiballZ_lineinfile.py'
Jan 22 22:07:13 compute-0 sudo[165058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:13 compute-0 python3.9[165060]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:13 compute-0 sudo[165058]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:13 compute-0 sudo[165210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awiuhrbuawkaikmoegbxssclvjckkxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119633.4572732-806-252842146221644/AnsiballZ_lineinfile.py'
Jan 22 22:07:13 compute-0 sudo[165210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:14 compute-0 python3.9[165212]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:14 compute-0 sudo[165210]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:14 compute-0 sudo[165362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbkhrdmsrkqogyljgkemsfjfjwhratny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119634.2887468-893-229689938945629/AnsiballZ_stat.py'
Jan 22 22:07:14 compute-0 sudo[165362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:14 compute-0 python3.9[165364]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:07:14 compute-0 sudo[165362]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:15 compute-0 sudo[165516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkpvgpnferhrzckvtvxbrxqfltqzfspo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119635.089482-917-219338405374659/AnsiballZ_command.py'
Jan 22 22:07:15 compute-0 sudo[165516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:15 compute-0 python3.9[165518]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:07:15 compute-0 sudo[165516]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:16 compute-0 sudo[165669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shfutcczwftbcjqvrcjarczunscesqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119635.955199-944-16256361032162/AnsiballZ_systemd_service.py'
Jan 22 22:07:16 compute-0 sudo[165669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:16 compute-0 python3.9[165671]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:16 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 22 22:07:16 compute-0 sudo[165669]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:17 compute-0 sudo[165825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yraassiwxrevjlucuilspjhalzwmkjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119636.9206436-968-251818947883162/AnsiballZ_systemd_service.py'
Jan 22 22:07:17 compute-0 sudo[165825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:17 compute-0 python3.9[165827]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:17 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 22 22:07:17 compute-0 udevadm[165832]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 22 22:07:17 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 22 22:07:17 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 22:07:17 compute-0 multipathd[165841]: --------start up--------
Jan 22 22:07:17 compute-0 multipathd[165841]: read /etc/multipath.conf
Jan 22 22:07:17 compute-0 multipathd[165841]: path checkers start up
Jan 22 22:07:17 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 22:07:17 compute-0 podman[165833]: 2026-01-22 22:07:17.801767094 +0000 UTC m=+0.074949251 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:07:17 compute-0 sudo[165825]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:18 compute-0 sudo[166009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvsdojnzlrcrtegbijghmxfymsmcstrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119638.3310711-1004-190220542462723/AnsiballZ_file.py'
Jan 22 22:07:18 compute-0 sudo[166009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:18 compute-0 python3.9[166011]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 22:07:18 compute-0 sudo[166009]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:19 compute-0 sudo[166161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtbajbrangszueukhrwtkacwdbpnwtur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119639.1983578-1028-265049981337/AnsiballZ_modprobe.py'
Jan 22 22:07:19 compute-0 sudo[166161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:19 compute-0 python3.9[166163]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 22 22:07:19 compute-0 kernel: Key type psk registered
Jan 22 22:07:19 compute-0 sudo[166161]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:20 compute-0 sudo[166324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcmdqsajzbalcramntvddtpmudhezmtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119639.9886367-1052-235851577543489/AnsiballZ_stat.py'
Jan 22 22:07:20 compute-0 sudo[166324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:20 compute-0 python3.9[166326]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:07:20 compute-0 sudo[166324]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:21 compute-0 sudo[166447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcvjbeejkyobtlfwjnidafsnswwiktui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119639.9886367-1052-235851577543489/AnsiballZ_copy.py'
Jan 22 22:07:21 compute-0 sudo[166447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:21 compute-0 python3.9[166449]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119639.9886367-1052-235851577543489/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:21 compute-0 sudo[166447]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:21 compute-0 sudo[166599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbrzlemzvlhavehwnhhebcnbrrznwkup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119641.5916924-1100-80784391337038/AnsiballZ_lineinfile.py'
Jan 22 22:07:21 compute-0 sudo[166599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:22 compute-0 python3.9[166601]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:22 compute-0 sudo[166599]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:22 compute-0 sudo[166751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsqcfismvptxrrfroctjalfrlyrvmakc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119642.34061-1124-251328655662012/AnsiballZ_systemd.py'
Jan 22 22:07:22 compute-0 sudo[166751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:22 compute-0 python3.9[166753]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:07:22 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 22:07:22 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 22 22:07:22 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 22 22:07:22 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 22 22:07:22 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 22 22:07:23 compute-0 sudo[166751]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:23 compute-0 sudo[166907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-driurwkwqrdhbzuzpsiwhdjowrxjlwfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119643.3812084-1148-67831921518704/AnsiballZ_dnf.py'
Jan 22 22:07:23 compute-0 sudo[166907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:23 compute-0 python3.9[166909]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 22:07:26 compute-0 systemd[1]: Reloading.
Jan 22 22:07:26 compute-0 systemd-rc-local-generator[166936]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:07:26 compute-0 systemd-sysv-generator[166939]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:07:26 compute-0 systemd[1]: Reloading.
Jan 22 22:07:26 compute-0 systemd-sysv-generator[166979]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:07:26 compute-0 systemd-rc-local-generator[166976]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:07:26 compute-0 systemd-logind[801]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 22:07:26 compute-0 systemd-logind[801]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 22:07:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 22:07:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 22 22:07:27 compute-0 systemd[1]: Reloading.
Jan 22 22:07:27 compute-0 systemd-rc-local-generator[167073]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:07:27 compute-0 systemd-sysv-generator[167076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:07:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 22:07:28 compute-0 sudo[166907]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:29 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 22:07:29 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 22 22:07:29 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.851s CPU time.
Jan 22 22:07:29 compute-0 systemd[1]: run-r60323accc1e74c36a3986caf075df45a.service: Deactivated successfully.
Jan 22 22:07:29 compute-0 sudo[168372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roeldbrnevunbyvdzpruxtqpocamoefq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119649.1644545-1172-96999786838099/AnsiballZ_systemd_service.py'
Jan 22 22:07:29 compute-0 sudo[168372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:29 compute-0 python3.9[168374]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:07:29 compute-0 iscsid[161866]: iscsid shutting down.
Jan 22 22:07:29 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 22 22:07:29 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 22 22:07:29 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 22 22:07:29 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 22:07:29 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 22 22:07:29 compute-0 systemd[1]: Started Open-iSCSI.
Jan 22 22:07:29 compute-0 sudo[168372]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:30 compute-0 sudo[168529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anqmowmetmcssgmlyhmyszcelxqbmdzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119650.111625-1196-33067633336438/AnsiballZ_systemd_service.py'
Jan 22 22:07:30 compute-0 sudo[168529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:30 compute-0 python3.9[168531]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:07:30 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 22 22:07:30 compute-0 multipathd[165841]: exit (signal)
Jan 22 22:07:30 compute-0 multipathd[165841]: --------shut down-------
Jan 22 22:07:30 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 22 22:07:30 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 22 22:07:30 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 22:07:30 compute-0 multipathd[168538]: --------start up--------
Jan 22 22:07:30 compute-0 multipathd[168538]: read /etc/multipath.conf
Jan 22 22:07:30 compute-0 multipathd[168538]: path checkers start up
Jan 22 22:07:30 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 22:07:30 compute-0 sudo[168529]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:31 compute-0 python3.9[168695]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:07:32 compute-0 sudo[168849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwwbynujpmnaastbinessgzjwhyxzbsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119652.2290795-1248-137945771018315/AnsiballZ_file.py'
Jan 22 22:07:32 compute-0 sudo[168849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:32 compute-0 python3.9[168851]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:32 compute-0 sudo[168849]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:33 compute-0 sudo[169001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icihwlcnjwvauhyfcprytjflcutyzkwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119653.1921606-1281-96708946478592/AnsiballZ_systemd_service.py'
Jan 22 22:07:33 compute-0 sudo[169001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:33 compute-0 python3.9[169003]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:07:33 compute-0 systemd[1]: Reloading.
Jan 22 22:07:33 compute-0 systemd-rc-local-generator[169033]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:07:33 compute-0 systemd-sysv-generator[169037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:07:34 compute-0 sudo[169001]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:35 compute-0 python3.9[169189]: ansible-ansible.builtin.service_facts Invoked
Jan 22 22:07:35 compute-0 network[169206]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 22:07:35 compute-0 network[169207]: 'network-scripts' will be removed from distribution in near future.
Jan 22 22:07:35 compute-0 network[169208]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 22:07:36 compute-0 sshd-session[169253]: Unable to negotiate with 13.76.211.192 port 57975: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Jan 22 22:07:38 compute-0 sudo[169480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jotlcforutrjrhbxybunsvvawjhhcxrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119658.6577053-1338-89376154040478/AnsiballZ_systemd_service.py'
Jan 22 22:07:39 compute-0 sudo[169480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:39 compute-0 python3.9[169482]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:39 compute-0 sudo[169480]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:39 compute-0 sudo[169633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjzywlpxlhsxyizvuexnxqueitytnbao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119659.4635925-1338-169004470542613/AnsiballZ_systemd_service.py'
Jan 22 22:07:39 compute-0 sudo[169633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:40 compute-0 python3.9[169635]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:40 compute-0 sudo[169633]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:40 compute-0 sudo[169786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfbltssuukaatqdadtgewwvfibrmvvid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119660.316771-1338-213200676652332/AnsiballZ_systemd_service.py'
Jan 22 22:07:40 compute-0 sudo[169786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:40 compute-0 python3.9[169788]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:40 compute-0 sudo[169786]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:41 compute-0 podman[169790]: 2026-01-22 22:07:41.096275006 +0000 UTC m=+0.113280947 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 22:07:41 compute-0 sudo[169965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgijtwzmbkcpxjdtldxbjyeudsazxkiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119661.1324627-1338-769151384607/AnsiballZ_systemd_service.py'
Jan 22 22:07:41 compute-0 sudo[169965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:41 compute-0 python3.9[169967]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:41 compute-0 sudo[169965]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:42 compute-0 sudo[170119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgetdyjnfxkxkjotwydzmdkabzhwktv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119661.9364266-1338-136679465312775/AnsiballZ_systemd_service.py'
Jan 22 22:07:42 compute-0 sudo[170119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:42 compute-0 python3.9[170121]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:42 compute-0 sudo[170119]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 22:07:42 compute-0 sudo[170273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-citgpucjxqjszweemfmbcopnfwyliock ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119662.6826825-1338-281079372882769/AnsiballZ_systemd_service.py'
Jan 22 22:07:42 compute-0 sudo[170273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:43 compute-0 python3.9[170275]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:43 compute-0 sudo[170273]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:43 compute-0 sudo[170426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpifinejmadgdjfecojpselvjgdxmbub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119663.4261394-1338-95673193424864/AnsiballZ_systemd_service.py'
Jan 22 22:07:43 compute-0 sudo[170426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:43 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 22 22:07:43 compute-0 python3.9[170428]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:43 compute-0 sudo[170426]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:44 compute-0 sudo[170580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuggckddlzoxupzsazurnufyzpjwpjpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119664.083153-1338-43771432909657/AnsiballZ_systemd_service.py'
Jan 22 22:07:44 compute-0 sudo[170580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:44 compute-0 python3.9[170582]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:07:44 compute-0 sudo[170580]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:44 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 22:07:45 compute-0 sudo[170734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezcjfjnzytfimmuwwyzzntsbdmcdtxew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119665.2183092-1515-12606777349901/AnsiballZ_file.py'
Jan 22 22:07:45 compute-0 sudo[170734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:45 compute-0 python3.9[170736]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:45 compute-0 sudo[170734]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:46 compute-0 sudo[170886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqqjyshyadrvtnjgexitwfennneteoph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119665.750798-1515-13086708840772/AnsiballZ_file.py'
Jan 22 22:07:46 compute-0 sudo[170886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:46 compute-0 python3.9[170888]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:46 compute-0 sudo[170886]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:46 compute-0 sudo[171038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dufuymtyvyawpvugonehofkmfzbvwjfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119666.3819134-1515-26818340164721/AnsiballZ_file.py'
Jan 22 22:07:46 compute-0 sudo[171038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:46 compute-0 python3.9[171040]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:46 compute-0 sudo[171038]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:47 compute-0 sudo[171190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqkbvcmpizheyqxsmpnzzekvbvxyjob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119666.9554055-1515-192625924785169/AnsiballZ_file.py'
Jan 22 22:07:47 compute-0 sudo[171190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:47 compute-0 python3.9[171192]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:47 compute-0 sudo[171190]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:47 compute-0 sudo[171353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkyhufogwcdfesowycchciqheuxpmnqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119667.671748-1515-164601965393834/AnsiballZ_file.py'
Jan 22 22:07:47 compute-0 sudo[171353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:47 compute-0 podman[171316]: 2026-01-22 22:07:47.953662954 +0000 UTC m=+0.061516162 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:07:48 compute-0 python3.9[171355]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:48 compute-0 sudo[171353]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:48 compute-0 sudo[171511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mycdhqwobczuasclvipyhaicdqqpyrgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119668.2862-1515-73729263292365/AnsiballZ_file.py'
Jan 22 22:07:48 compute-0 sudo[171511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:48 compute-0 python3.9[171513]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:48 compute-0 sudo[171511]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:49 compute-0 sudo[171663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkglmahxaczwrvgfnbrlasbzbpwiiokr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119668.8930645-1515-43406267834224/AnsiballZ_file.py'
Jan 22 22:07:49 compute-0 sudo[171663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:49 compute-0 python3.9[171665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:49 compute-0 sudo[171663]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:49 compute-0 sudo[171815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acssopsceagbepyxjvtadtuwdgpddfik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119669.52463-1515-128344663722272/AnsiballZ_file.py'
Jan 22 22:07:49 compute-0 sudo[171815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:49 compute-0 python3.9[171817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:50 compute-0 sudo[171815]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:50 compute-0 sudo[171967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmjywjuqbxtqnvqmckujgvtdvbaocvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119670.6563106-1686-230543251013523/AnsiballZ_file.py'
Jan 22 22:07:50 compute-0 sudo[171967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:51 compute-0 python3.9[171969]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:51 compute-0 sudo[171967]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:51 compute-0 sudo[172119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxdrcijkrofwwhhhcxnjcyfsvjmegoxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119671.3408628-1686-119623964408837/AnsiballZ_file.py'
Jan 22 22:07:51 compute-0 sudo[172119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:51 compute-0 python3.9[172121]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:51 compute-0 sudo[172119]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:52 compute-0 sudo[172271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhstkikzzynlzbsijmqyylejdvfugmas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119671.9838872-1686-41768311879621/AnsiballZ_file.py'
Jan 22 22:07:52 compute-0 sudo[172271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:52 compute-0 python3.9[172273]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:52 compute-0 sudo[172271]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:53 compute-0 sudo[172423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovfxyolfjambvsradfogsaanizxhjgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119672.7386594-1686-107388347367327/AnsiballZ_file.py'
Jan 22 22:07:53 compute-0 sudo[172423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:53 compute-0 python3.9[172425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:53 compute-0 sudo[172423]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:53 compute-0 sudo[172575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqpozveyduqtfgnejdctammkjaiaxntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119673.3985925-1686-134480656053681/AnsiballZ_file.py'
Jan 22 22:07:53 compute-0 sudo[172575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:53 compute-0 python3.9[172577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:53 compute-0 sudo[172575]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:54 compute-0 sudo[172727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmxlcirlsothvnmjqojwtdvtfyuyenn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119674.0325258-1686-79744876942490/AnsiballZ_file.py'
Jan 22 22:07:54 compute-0 sudo[172727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:54 compute-0 python3.9[172729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:54 compute-0 sudo[172727]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:55 compute-0 sudo[172879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxkcokfpwktkbadklfrpzweaajystdni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119674.7142415-1686-255210742143488/AnsiballZ_file.py'
Jan 22 22:07:55 compute-0 sudo[172879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:55 compute-0 python3.9[172881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:55 compute-0 sudo[172879]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:55 compute-0 sudo[173031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlqswdikayaorvrqgnwgezuzjagcoeki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119675.3924778-1686-88384785926144/AnsiballZ_file.py'
Jan 22 22:07:55 compute-0 sudo[173031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:55 compute-0 python3.9[173033]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:07:55 compute-0 sudo[173031]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:56 compute-0 sudo[173183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifqdsgcszrflozamlncrfnaarpxzcsyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119676.1919131-1860-230140582921074/AnsiballZ_command.py'
Jan 22 22:07:56 compute-0 sudo[173183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:56 compute-0 python3.9[173185]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:07:56 compute-0 sudo[173183]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:57 compute-0 python3.9[173337]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 22:07:58 compute-0 sudo[173487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxpslpdaqzhafkagtemfiwevsbsnhial ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119678.049533-1914-152234476048035/AnsiballZ_systemd_service.py'
Jan 22 22:07:58 compute-0 sudo[173487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:58 compute-0 python3.9[173489]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:07:58 compute-0 systemd[1]: Reloading.
Jan 22 22:07:58 compute-0 systemd-rc-local-generator[173517]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:07:58 compute-0 systemd-sysv-generator[173521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:07:59 compute-0 sudo[173487]: pam_unix(sudo:session): session closed for user root
Jan 22 22:07:59 compute-0 sudo[173675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhzaqwczzzdptnnwsvwtrrkdupgympex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119679.2308118-1938-234312339898056/AnsiballZ_command.py'
Jan 22 22:07:59 compute-0 sudo[173675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:07:59 compute-0 python3.9[173677]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:07:59 compute-0 sudo[173675]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:00 compute-0 sudo[173828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcysigvfxfdwpanohxomdbjprgtjqknq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119679.9397345-1938-123316302261539/AnsiballZ_command.py'
Jan 22 22:08:00 compute-0 sudo[173828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:00 compute-0 python3.9[173830]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:00 compute-0 sudo[173828]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:00 compute-0 sudo[173981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtesvzpyzihfbboqmybyecfewrataadj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119680.689857-1938-24943721094175/AnsiballZ_command.py'
Jan 22 22:08:00 compute-0 sudo[173981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:01 compute-0 python3.9[173983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:01 compute-0 sudo[173981]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:01 compute-0 sudo[174134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihpcvbohjumvpjqttkuiqvikjefzoshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119681.4371738-1938-29966424603936/AnsiballZ_command.py'
Jan 22 22:08:01 compute-0 sudo[174134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:02 compute-0 python3.9[174136]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:02 compute-0 sudo[174134]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:02 compute-0 sudo[174287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcaawtmhcxoavpgwujcwgvkgiifamewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119682.2654119-1938-199161949718109/AnsiballZ_command.py'
Jan 22 22:08:02 compute-0 sudo[174287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:02 compute-0 python3.9[174289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:02 compute-0 sudo[174287]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:03 compute-0 sudo[174440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naahhdhwcphevjnqikmknqzvufabsnoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119682.8874497-1938-122622727551552/AnsiballZ_command.py'
Jan 22 22:08:03 compute-0 sudo[174440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:03 compute-0 python3.9[174442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:03 compute-0 sudo[174440]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:03 compute-0 sudo[174593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dalmlfxqzjfvelwptpkvtirtnnkdyloy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119683.470014-1938-178500726585507/AnsiballZ_command.py'
Jan 22 22:08:03 compute-0 sudo[174593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:03 compute-0 python3.9[174595]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:03 compute-0 sudo[174593]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:04 compute-0 sudo[174746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nggzraqljizclfcnriexxkpgcagjufdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119684.089061-1938-208078687303178/AnsiballZ_command.py'
Jan 22 22:08:04 compute-0 sudo[174746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:04 compute-0 python3.9[174748]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:08:04 compute-0 sudo[174746]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:06 compute-0 sudo[174899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqybyaxwybsttvvyhmpiuujqkfdlossk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119685.9430356-2145-79326869086635/AnsiballZ_file.py'
Jan 22 22:08:06 compute-0 sudo[174899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:06 compute-0 python3.9[174901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:06 compute-0 sudo[174899]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:06 compute-0 sudo[175051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqbpxspqekvvjwxzpfjjpqklelqvachs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119686.6126423-2145-235740269197519/AnsiballZ_file.py'
Jan 22 22:08:06 compute-0 sudo[175051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:07 compute-0 python3.9[175053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:07 compute-0 sudo[175051]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:07 compute-0 sudo[175203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swlbprjczlinblwudhjxdibbpedcznfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119687.3593721-2145-161886287824668/AnsiballZ_file.py'
Jan 22 22:08:07 compute-0 sudo[175203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:07 compute-0 python3.9[175205]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:07 compute-0 sudo[175203]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:08 compute-0 sudo[175355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qowfgaftavnnwmvmuvderlhbyltxehhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119688.5287757-2211-207122559466004/AnsiballZ_file.py'
Jan 22 22:08:08 compute-0 sudo[175355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:09 compute-0 python3.9[175357]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:09 compute-0 sudo[175355]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:09 compute-0 sudo[175507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkdluqentgtjwlwcqzrikpvioziaugot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119689.2394848-2211-82745746020530/AnsiballZ_file.py'
Jan 22 22:08:09 compute-0 sudo[175507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:09 compute-0 python3.9[175509]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:09 compute-0 sudo[175507]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:10 compute-0 sudo[175659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdcufjkvtykwshamqxwwspnkkcuxtsjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119689.9508004-2211-108720841150492/AnsiballZ_file.py'
Jan 22 22:08:10 compute-0 sudo[175659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:10 compute-0 python3.9[175661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:10 compute-0 sudo[175659]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:11 compute-0 sudo[175811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxinyjsznyyvnetlznhqafgveuxgxgzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119690.7021039-2211-88075255967440/AnsiballZ_file.py'
Jan 22 22:08:11 compute-0 sudo[175811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:11 compute-0 python3.9[175813]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:11 compute-0 sudo[175811]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:11 compute-0 virtnodedevd[153570]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 22:08:11 compute-0 virtnodedevd[153570]: hostname: compute-0
Jan 22 22:08:11 compute-0 virtnodedevd[153570]: Make forcefull daemon shutdown
Jan 22 22:08:11 compute-0 systemd[1]: virtnodedevd.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:08:11 compute-0 systemd[1]: virtnodedevd.service: Failed with result 'exit-code'.
Jan 22 22:08:11 compute-0 podman[175890]: 2026-01-22 22:08:11.664255022 +0000 UTC m=+0.084867826 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:08:11 compute-0 systemd[1]: virtnodedevd.service: Scheduled restart job, restart counter is at 1.
Jan 22 22:08:11 compute-0 systemd[1]: Stopped libvirt nodedev daemon.
Jan 22 22:08:11 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 22:08:11 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 22 22:08:11 compute-0 sudo[176010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvtthywimzxspkvfqnbtssbjtquyokqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119691.4696758-2211-218382737586134/AnsiballZ_file.py'
Jan 22 22:08:11 compute-0 sudo[176010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:11 compute-0 python3.9[176012]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:12 compute-0 sudo[176010]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:08:12.416 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:08:12.416 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:08:12.417 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:08:12 compute-0 sudo[176164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvzekdcfhhreniwdhcydjbcnolhtoaua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119692.1830835-2211-54484810781475/AnsiballZ_file.py'
Jan 22 22:08:12 compute-0 sudo[176164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:12 compute-0 python3.9[176166]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:12 compute-0 sudo[176164]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:13 compute-0 sudo[176316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffajsfbbyrqdpjxlvtzxbklyqybuiily ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119693.1194632-2211-245681913087032/AnsiballZ_file.py'
Jan 22 22:08:13 compute-0 sudo[176316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:13 compute-0 python3.9[176318]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:13 compute-0 sudo[176316]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:18 compute-0 podman[176348]: 2026-01-22 22:08:18.178336915 +0000 UTC m=+0.099830739 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:08:18 compute-0 sudo[176487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuwdtysnhmhxrsszwkmmypwqsggiilpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119698.0762959-2516-38736609258711/AnsiballZ_getent.py'
Jan 22 22:08:18 compute-0 sudo[176487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:18 compute-0 python3.9[176489]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 22 22:08:18 compute-0 sudo[176487]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:19 compute-0 sudo[176641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drlldxppqeqhzgqiwsplsumqfvqhipjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119698.999902-2540-223631067628236/AnsiballZ_group.py'
Jan 22 22:08:19 compute-0 sudo[176641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:19 compute-0 python3.9[176643]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 22:08:19 compute-0 groupadd[176644]: group added to /etc/group: name=nova, GID=42436
Jan 22 22:08:19 compute-0 groupadd[176644]: group added to /etc/gshadow: name=nova
Jan 22 22:08:19 compute-0 groupadd[176644]: new group: name=nova, GID=42436
Jan 22 22:08:19 compute-0 sudo[176641]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:20 compute-0 sudo[176800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcorcguxcsewsvccxqydmzocefukufek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119699.982186-2564-193731655370587/AnsiballZ_user.py'
Jan 22 22:08:20 compute-0 sudo[176800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:20 compute-0 python3.9[176802]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 22:08:20 compute-0 useradd[176804]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 22 22:08:20 compute-0 useradd[176804]: add 'nova' to group 'libvirt'
Jan 22 22:08:20 compute-0 useradd[176804]: add 'nova' to shadow group 'libvirt'
Jan 22 22:08:21 compute-0 sudo[176800]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:22 compute-0 sshd-session[176835]: Accepted publickey for zuul from 192.168.122.30 port 36682 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 22:08:22 compute-0 systemd-logind[801]: New session 24 of user zuul.
Jan 22 22:08:22 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 22 22:08:22 compute-0 sshd-session[176835]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 22:08:22 compute-0 sshd-session[176838]: Received disconnect from 192.168.122.30 port 36682:11: disconnected by user
Jan 22 22:08:22 compute-0 sshd-session[176838]: Disconnected from user zuul 192.168.122.30 port 36682
Jan 22 22:08:22 compute-0 sshd-session[176835]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:08:22 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 22 22:08:22 compute-0 systemd-logind[801]: Session 24 logged out. Waiting for processes to exit.
Jan 22 22:08:22 compute-0 systemd-logind[801]: Removed session 24.
Jan 22 22:08:22 compute-0 python3.9[176988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:23 compute-0 python3.9[177109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119702.4362123-2639-173294353708041/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:24 compute-0 python3.9[177259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:24 compute-0 sshd-session[176567]: Connection reset by 205.210.31.83 port 58338 [preauth]
Jan 22 22:08:24 compute-0 python3.9[177335]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:25 compute-0 python3.9[177485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:25 compute-0 python3.9[177606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119704.9196327-2639-274764967360813/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:26 compute-0 python3.9[177756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:27 compute-0 python3.9[177877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119706.181241-2639-101361142288708/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:28 compute-0 python3.9[178027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:28 compute-0 python3.9[178148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119707.6862175-2639-97510858272070/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:29 compute-0 python3.9[178298]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:29 compute-0 python3.9[178419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119708.8695717-2639-160135284640512/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:30 compute-0 sudo[178569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzsgxoajsimkggtizkjsisecoauyfudw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119710.334257-2888-269083752225158/AnsiballZ_file.py'
Jan 22 22:08:30 compute-0 sudo[178569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:30 compute-0 python3.9[178571]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:08:30 compute-0 sudo[178569]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:31 compute-0 sudo[178721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xctjzznabujfeyfodktswveqoayidovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119711.1190267-2912-157179262692171/AnsiballZ_copy.py'
Jan 22 22:08:31 compute-0 sudo[178721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:31 compute-0 python3.9[178723]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:08:31 compute-0 sudo[178721]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:32 compute-0 sudo[178873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nixcmgcblnowcpemyxrdrmysbegkfaxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119711.8527856-2936-200708192115919/AnsiballZ_stat.py'
Jan 22 22:08:32 compute-0 sudo[178873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:32 compute-0 python3.9[178875]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:32 compute-0 sudo[178873]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:32 compute-0 sudo[179025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mheztxetqgninekacbsgshltmunrabgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119712.591782-2960-158172534328823/AnsiballZ_stat.py'
Jan 22 22:08:32 compute-0 sudo[179025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:33 compute-0 python3.9[179027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:33 compute-0 sudo[179025]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:33 compute-0 sudo[179148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xposvlkqujlbuvxlmtzsdutmviivhbxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119712.591782-2960-158172534328823/AnsiballZ_copy.py'
Jan 22 22:08:33 compute-0 sudo[179148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:33 compute-0 python3.9[179150]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769119712.591782-2960-158172534328823/.source _original_basename=.zket9ysl follow=False checksum=0d290c9977d33a7e9377a5d171958e1c9f242552 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 22 22:08:33 compute-0 sudo[179148]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:34 compute-0 python3.9[179302]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:35 compute-0 python3.9[179454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:36 compute-0 python3.9[179575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119715.0758872-3038-265989974832969/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:36 compute-0 python3.9[179725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:08:37 compute-0 python3.9[179846]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119716.298986-3083-1105818256194/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:08:38 compute-0 sudo[179996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzixwqspinmrdlzotejmzrtsoaadgtvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119717.8678699-3134-67782694018056/AnsiballZ_container_config_data.py'
Jan 22 22:08:38 compute-0 sudo[179996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:38 compute-0 python3.9[179998]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 22 22:08:38 compute-0 sudo[179996]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:39 compute-0 sudo[180148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmspydlwagrlpncdcgqdnxyfckynafhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119719.1326902-3167-180548555441105/AnsiballZ_container_config_hash.py'
Jan 22 22:08:39 compute-0 sudo[180148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:39 compute-0 python3.9[180150]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:08:39 compute-0 sudo[180148]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:40 compute-0 sudo[180300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iabbctddmuarmraafbwvbhcnporspxtf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119720.2615113-3197-206548059939904/AnsiballZ_edpm_container_manage.py'
Jan 22 22:08:40 compute-0 sudo[180300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:41 compute-0 python3[180302]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:08:41 compute-0 podman[180342]: 2026-01-22 22:08:41.300457315 +0000 UTC m=+0.074896582 container create d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, container_name=nova_compute_init, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:08:41 compute-0 podman[180342]: 2026-01-22 22:08:41.267920585 +0000 UTC m=+0.042359842 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 22:08:41 compute-0 python3[180302]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 22 22:08:41 compute-0 sudo[180300]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:42 compute-0 sudo[180540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlzojeahlbersomyibjzaweloonvxfsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119721.6957967-3221-194748703824522/AnsiballZ_stat.py'
Jan 22 22:08:42 compute-0 sudo[180540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:42 compute-0 podman[180504]: 2026-01-22 22:08:42.062854321 +0000 UTC m=+0.124490368 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:08:42 compute-0 python3.9[180551]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:42 compute-0 sudo[180540]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:43 compute-0 sudo[180710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofbgamhtslrezqkitqirmsbguclcgubz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119722.8498044-3257-93278646172619/AnsiballZ_container_config_data.py'
Jan 22 22:08:43 compute-0 sudo[180710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:43 compute-0 python3.9[180712]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 22 22:08:43 compute-0 sudo[180710]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:44 compute-0 sudo[180862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxfjwzustlnkahlpduyrnltzfvpmfnwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119723.8719523-3290-273076338381823/AnsiballZ_container_config_hash.py'
Jan 22 22:08:44 compute-0 sudo[180862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:44 compute-0 python3.9[180864]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:08:44 compute-0 sudo[180862]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:45 compute-0 sudo[181014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmsqcjspugzzqifrkhvmllyfguwhulgm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119724.8206322-3320-197559807522757/AnsiballZ_edpm_container_manage.py'
Jan 22 22:08:45 compute-0 sudo[181014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:45 compute-0 python3[181016]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:08:45 compute-0 podman[181054]: 2026-01-22 22:08:45.578036125 +0000 UTC m=+0.034794899 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 22:08:46 compute-0 podman[181054]: 2026-01-22 22:08:46.024398075 +0000 UTC m=+0.481156769 container create 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, container_name=nova_compute)
Jan 22 22:08:46 compute-0 python3[181016]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 22 22:08:46 compute-0 sudo[181014]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:46 compute-0 sudo[181243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfujnemxboerrhqxltdmkfumargyccbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119726.3518-3344-196929828521894/AnsiballZ_stat.py'
Jan 22 22:08:46 compute-0 sudo[181243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:46 compute-0 python3.9[181245]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:46 compute-0 sudo[181243]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:47 compute-0 sudo[181397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppdcdotvcbbaxcoymiyfbyekhxssxlnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119727.290399-3371-264458923380000/AnsiballZ_file.py'
Jan 22 22:08:47 compute-0 sudo[181397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:47 compute-0 python3.9[181399]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:08:47 compute-0 sudo[181397]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:48 compute-0 sudo[181548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grknagthtxfcxrqbapjszyqyqgbupdbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119727.8568208-3371-71507867288600/AnsiballZ_copy.py'
Jan 22 22:08:48 compute-0 sudo[181548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:48 compute-0 podman[181550]: 2026-01-22 22:08:48.366933144 +0000 UTC m=+0.083476272 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 22:08:48 compute-0 python3.9[181551]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119727.8568208-3371-71507867288600/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:08:48 compute-0 sudo[181548]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:48 compute-0 sudo[181642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kevlugotlorrjgvwwdshvrjztinzehgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119727.8568208-3371-71507867288600/AnsiballZ_systemd.py'
Jan 22 22:08:48 compute-0 sudo[181642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:49 compute-0 python3.9[181644]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:08:49 compute-0 systemd[1]: Reloading.
Jan 22 22:08:49 compute-0 systemd-rc-local-generator[181671]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:08:49 compute-0 systemd-sysv-generator[181674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:08:49 compute-0 sudo[181642]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:49 compute-0 sudo[181752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzomzbuvnlqlsshmfxunenswaxtbsmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119727.8568208-3371-71507867288600/AnsiballZ_systemd.py'
Jan 22 22:08:49 compute-0 sudo[181752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:50 compute-0 python3.9[181754]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:08:50 compute-0 systemd[1]: Reloading.
Jan 22 22:08:50 compute-0 systemd-sysv-generator[181787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:08:50 compute-0 systemd-rc-local-generator[181784]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:08:50 compute-0 systemd[1]: Starting nova_compute container...
Jan 22 22:08:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:50 compute-0 podman[181794]: 2026-01-22 22:08:50.677353013 +0000 UTC m=+0.256203609 container init 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 22 22:08:50 compute-0 podman[181794]: 2026-01-22 22:08:50.684080805 +0000 UTC m=+0.262931391 container start 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Jan 22 22:08:50 compute-0 nova_compute[181809]: + sudo -E kolla_set_configs
Jan 22 22:08:50 compute-0 podman[181794]: nova_compute
Jan 22 22:08:50 compute-0 systemd[1]: Started nova_compute container.
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Validating config file
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying service configuration files
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Deleting /etc/ceph
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Creating directory /etc/ceph
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Writing out command to execute
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:50 compute-0 nova_compute[181809]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 22:08:50 compute-0 nova_compute[181809]: ++ cat /run_command
Jan 22 22:08:50 compute-0 nova_compute[181809]: + CMD=nova-compute
Jan 22 22:08:50 compute-0 nova_compute[181809]: + ARGS=
Jan 22 22:08:50 compute-0 nova_compute[181809]: + sudo kolla_copy_cacerts
Jan 22 22:08:50 compute-0 sudo[181752]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:50 compute-0 nova_compute[181809]: + [[ ! -n '' ]]
Jan 22 22:08:50 compute-0 nova_compute[181809]: + . kolla_extend_start
Jan 22 22:08:50 compute-0 nova_compute[181809]: Running command: 'nova-compute'
Jan 22 22:08:50 compute-0 nova_compute[181809]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 22:08:50 compute-0 nova_compute[181809]: + umask 0022
Jan 22 22:08:50 compute-0 nova_compute[181809]: + exec nova-compute
Jan 22 22:08:52 compute-0 python3.9[181971]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.771 181813 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.771 181813 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.772 181813 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.772 181813 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.911 181813 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.929 181813 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:08:52 compute-0 nova_compute[181809]: 2026-01-22 22:08:52.930 181813 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:08:52 compute-0 python3.9[182123]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.443 181813 INFO nova.virt.driver [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.569 181813 INFO nova.compute.provider_config [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.582 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.583 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.583 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.584 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.585 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.585 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.585 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.585 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.585 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.586 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.586 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.586 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.586 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.586 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.587 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.587 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.587 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.587 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.587 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.588 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.588 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.588 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.588 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.588 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.589 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.589 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.589 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.589 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.590 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.590 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.590 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.590 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.590 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.591 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.591 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.591 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.591 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.592 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.592 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.592 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.592 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.592 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.593 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.593 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.593 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.593 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.593 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.594 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.594 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.594 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.594 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.594 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.595 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.596 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.596 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.596 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.596 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.597 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.598 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.598 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.598 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.598 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.598 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.599 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.599 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.599 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.599 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.599 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.600 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.600 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.600 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.600 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.601 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.601 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.601 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.601 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.601 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.602 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.602 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.602 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.602 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.602 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.603 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.604 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.604 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.604 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.604 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.604 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.605 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.605 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.605 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.605 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.605 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.606 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.606 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.606 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.606 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.606 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.607 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.607 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.607 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.607 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.607 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.608 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.608 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.608 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.608 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.608 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.609 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.609 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.609 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.609 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.609 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.610 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.610 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.610 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.610 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.610 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.611 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.612 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.612 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.612 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.612 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.612 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.613 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.613 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.613 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.613 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.614 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.614 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.614 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.614 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.614 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.615 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.615 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.615 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.615 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.615 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.616 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.616 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.616 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.616 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.616 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.617 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.617 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.617 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.617 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.617 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.618 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.618 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.618 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.618 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.618 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.619 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.619 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.619 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.619 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.619 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.620 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.620 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.620 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.620 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.620 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.621 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.621 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.621 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.621 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.622 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.622 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.622 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.622 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.622 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.623 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.623 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.623 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.623 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.623 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.624 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.624 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.624 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.624 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.624 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.625 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.626 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.626 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.626 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.626 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.626 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.627 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.627 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.627 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.627 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.627 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.628 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.629 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.629 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.629 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.629 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.629 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.630 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.630 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.630 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.630 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.630 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.631 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.631 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.631 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.631 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.631 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.632 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.633 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.633 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.633 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.633 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.633 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.634 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.634 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.634 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.634 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.634 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.635 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.635 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.635 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.635 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.635 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.636 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.636 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.636 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.636 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.636 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.637 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.637 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.637 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.637 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.637 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.638 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.638 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.638 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.638 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.638 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.639 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.639 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.639 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.639 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.639 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.640 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.641 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.641 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.641 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.641 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.642 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.643 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.643 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.643 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.643 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.643 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.644 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.644 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.644 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.644 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.644 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.645 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.645 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.645 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.645 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.645 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.646 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.646 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.646 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.646 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.646 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.647 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.647 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.647 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.647 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.647 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.648 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.649 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.649 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.649 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.649 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.649 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.650 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.650 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.650 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.650 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.650 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.651 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.651 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.651 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.651 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.651 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.652 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.652 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.652 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.652 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.652 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.653 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.653 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.653 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.653 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.653 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.654 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.654 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.654 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.654 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.655 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.655 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.655 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.655 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.655 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.656 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.656 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.656 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.656 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.656 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.657 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.657 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.657 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.657 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.657 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.658 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.659 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.659 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.659 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.659 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.659 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.660 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.660 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.660 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.660 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.660 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.661 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.661 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.661 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.661 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.661 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.662 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.662 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.662 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.662 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.662 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.663 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.664 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.665 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.665 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.665 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.665 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.665 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.666 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.666 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.666 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.666 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.666 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.667 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.667 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.667 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.667 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.667 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.668 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.669 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.669 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.669 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.669 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.669 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.670 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.670 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.670 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.670 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.670 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.671 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.671 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.671 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.671 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.671 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.672 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.672 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.672 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.672 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.673 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.673 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.673 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.673 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.673 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.674 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.674 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.674 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.674 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.674 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.675 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.676 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.677 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.678 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.678 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.678 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.678 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.678 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.679 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.679 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.679 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.679 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.680 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.680 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.680 181813 WARNING oslo_config.cfg [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 22:08:53 compute-0 nova_compute[181809]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 22:08:53 compute-0 nova_compute[181809]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 22:08:53 compute-0 nova_compute[181809]: and ``live_migration_inbound_addr`` respectively.
Jan 22 22:08:53 compute-0 nova_compute[181809]: ).  Its value may be silently ignored in the future.
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.680 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.681 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.681 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.681 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.681 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.681 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.682 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.682 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.682 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.682 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.682 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.683 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.683 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.683 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.683 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.683 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.684 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.685 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.685 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.685 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.685 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.685 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.686 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.686 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.686 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.686 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.687 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.688 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.688 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.688 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.688 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.688 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.689 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.689 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.689 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.689 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.689 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.690 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.690 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.690 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.690 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.691 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.692 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.692 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.692 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.692 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.692 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.693 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.694 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.694 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.694 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.694 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.694 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.695 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.695 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.695 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.695 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.695 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.696 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.696 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.696 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.696 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.696 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.697 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.697 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.697 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.697 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.697 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.698 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.698 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.698 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.698 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.698 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.699 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.699 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.699 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.699 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.699 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.700 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.700 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.700 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.700 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.701 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.701 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.701 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.701 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.701 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.702 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.703 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.703 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.703 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.703 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.703 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.704 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.704 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.704 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.704 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.705 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.705 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.705 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.705 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.705 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.706 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.706 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.706 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.706 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.706 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.707 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.707 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.707 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.707 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.707 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.708 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.708 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.708 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.708 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.709 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.709 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.709 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.709 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.709 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.710 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.710 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.710 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.710 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.710 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.711 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.711 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.711 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.711 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.711 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.712 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.712 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.712 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.712 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.712 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.713 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.713 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.713 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.713 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.713 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.714 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.715 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.715 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.715 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.715 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.715 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.716 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.716 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.716 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.716 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.717 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.718 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.718 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.718 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.718 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.718 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.719 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.719 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.719 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.719 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.719 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.720 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.721 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.721 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.721 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.721 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.721 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.722 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.722 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.722 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.722 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.722 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.723 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.723 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.723 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.723 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.723 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.724 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.725 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.726 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.727 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.727 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.727 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.727 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.727 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.728 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.728 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.728 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.728 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.728 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.729 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.729 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.729 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.729 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.730 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.730 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.730 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.730 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.730 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.731 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.732 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.732 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.732 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.732 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.732 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.733 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.734 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.735 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.736 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.737 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.738 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.739 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.739 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.739 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.739 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.739 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.740 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.741 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.741 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.741 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.741 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.741 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.742 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.743 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.744 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.745 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.745 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.745 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.745 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.746 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.747 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.748 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.748 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.748 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.748 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.748 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.749 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.750 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.750 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.750 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.750 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.750 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.751 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.751 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.751 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.751 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.751 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.752 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.752 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.752 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.752 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.752 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.753 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.754 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.754 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.754 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.754 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.754 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.755 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.755 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.755 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.755 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.755 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.756 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.757 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.757 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.757 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.757 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.757 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.758 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.758 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.758 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.758 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.758 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.759 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.759 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.759 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.759 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.759 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.760 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.760 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.760 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.760 181813 DEBUG oslo_service.service [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.762 181813 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 22 22:08:53 compute-0 python3.9[182275]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.774 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.775 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.775 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.776 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 22 22:08:53 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 22:08:53 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.864 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2f4f9661c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.867 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2f4f9661c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.868 181813 INFO nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Connection event '1' reason 'None'
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.881 181813 WARNING nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 22 22:08:53 compute-0 nova_compute[181809]: 2026-01-22 22:08:53.882 181813 DEBUG nova.virt.libvirt.volume.mount [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 22 22:08:54 compute-0 sudo[182485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmwfimmvujvtsyoraffydvpibmrwnzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119734.0784886-3551-227396504497854/AnsiballZ_podman_container.py'
Jan 22 22:08:54 compute-0 sudo[182485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.711 181813 INFO nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]: 
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <host>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <uuid>148e2083-b3dc-4db0-b189-a79547a2be98</uuid>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <arch>x86_64</arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model>EPYC-Rome-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <vendor>AMD</vendor>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <microcode version='16777317'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <signature family='23' model='49' stepping='0'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='x2apic'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='tsc-deadline'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='osxsave'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='hypervisor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='tsc_adjust'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='spec-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='stibp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='arch-capabilities'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='cmp_legacy'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='topoext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='virt-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='lbrv'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='tsc-scale'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='vmcb-clean'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='pause-filter'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='pfthreshold'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='svme-addr-chk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='rdctl-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='skip-l1dfl-vmentry'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='mds-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature name='pschange-mc-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <pages unit='KiB' size='4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <pages unit='KiB' size='2048'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <pages unit='KiB' size='1048576'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <power_management>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <suspend_mem/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <suspend_disk/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <suspend_hybrid/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </power_management>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <iommu support='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <migration_features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <live/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <uri_transports>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <uri_transport>tcp</uri_transport>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <uri_transport>rdma</uri_transport>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </uri_transports>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </migration_features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <topology>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <cells num='1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <cell id='0'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <memory unit='KiB'>7864316</memory>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <pages unit='KiB' size='2048'>0</pages>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <distances>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <sibling id='0' value='10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           </distances>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           <cpus num='8'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:           </cpus>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         </cell>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </cells>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </topology>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <cache>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </cache>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <secmodel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model>selinux</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <doi>0</doi>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </secmodel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <secmodel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model>dac</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <doi>0</doi>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </secmodel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </host>
Jan 22 22:08:54 compute-0 nova_compute[181809]: 
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <guest>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <os_type>hvm</os_type>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <arch name='i686'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <wordsize>32</wordsize>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <domain type='qemu'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <domain type='kvm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <pae/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <nonpae/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <acpi default='on' toggle='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <apic default='on' toggle='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <cpuselection/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <deviceboot/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <disksnapshot default='on' toggle='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <externalSnapshot/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </guest>
Jan 22 22:08:54 compute-0 nova_compute[181809]: 
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <guest>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <os_type>hvm</os_type>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <arch name='x86_64'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <wordsize>64</wordsize>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <domain type='qemu'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <domain type='kvm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <acpi default='on' toggle='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <apic default='on' toggle='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <cpuselection/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <deviceboot/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <disksnapshot default='on' toggle='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <externalSnapshot/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </guest>
Jan 22 22:08:54 compute-0 nova_compute[181809]: 
Jan 22 22:08:54 compute-0 nova_compute[181809]: </capabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]: 
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.719 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.745 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 22:08:54 compute-0 nova_compute[181809]: <domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <domain>kvm</domain>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <arch>i686</arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <vcpu max='4096'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <iothreads supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <os supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='firmware'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <loader supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>rom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pflash</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='readonly'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>yes</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='secure'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </loader>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </os>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='maximum' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='maximumMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-model' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <vendor>AMD</vendor>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='x2apic'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='stibp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='succor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lbrv'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='custom' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Dhyana-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <memoryBacking supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='sourceType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>anonymous</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>memfd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </memoryBacking>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <disk supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='diskDevice'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>disk</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cdrom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>floppy</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>lun</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>fdc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>sata</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </disk>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <graphics supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vnc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egl-headless</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </graphics>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <video supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='modelType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vga</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cirrus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>none</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>bochs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ramfb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </video>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hostdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='mode'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>subsystem</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='startupPolicy'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>mandatory</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>requisite</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>optional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='subsysType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pci</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='capsType'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='pciBackend'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hostdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <rng supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>random</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </rng>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <filesystem supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='driverType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>path</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>handle</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtiofs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </filesystem>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tpm supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-tis</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-crb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emulator</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>external</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendVersion'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>2.0</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </tpm>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <redirdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </redirdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <channel supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </channel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <crypto supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </crypto>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <interface supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>passt</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </interface>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <panic supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>isa</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>hyperv</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </panic>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <console supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>null</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dev</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pipe</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stdio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>udp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tcp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu-vdagent</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </console>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <gic supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <vmcoreinfo supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <genid supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backingStoreInput supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backup supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <async-teardown supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <s390-pv supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <ps2 supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tdx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sev supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sgx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hyperv supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='features'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>relaxed</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vapic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>spinlocks</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vpindex</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>runtime</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>synic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stimer</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reset</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vendor_id</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>frequencies</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reenlightenment</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tlbflush</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ipi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>avic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emsr_bitmap</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>xmm_input</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <spinlocks>4095</spinlocks>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <stimer_direct>on</stimer_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hyperv>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <launchSecurity supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </features>
Jan 22 22:08:54 compute-0 nova_compute[181809]: </domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.757 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 22:08:54 compute-0 nova_compute[181809]: <domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <domain>kvm</domain>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <arch>i686</arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <vcpu max='240'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <iothreads supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <os supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='firmware'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <loader supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>rom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pflash</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='readonly'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>yes</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='secure'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </loader>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </os>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='maximum' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='maximumMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-model' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <vendor>AMD</vendor>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='x2apic'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='stibp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='succor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lbrv'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='custom' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Dhyana-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 python3.9[182487]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <memoryBacking supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='sourceType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>anonymous</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>memfd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </memoryBacking>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <disk supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='diskDevice'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>disk</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cdrom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>floppy</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>lun</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ide</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>fdc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>sata</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </disk>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <graphics supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vnc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egl-headless</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </graphics>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <video supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='modelType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vga</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cirrus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>none</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>bochs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ramfb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </video>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hostdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='mode'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>subsystem</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='startupPolicy'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>mandatory</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>requisite</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>optional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='subsysType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pci</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='capsType'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='pciBackend'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hostdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <rng supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>random</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </rng>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <filesystem supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='driverType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>path</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>handle</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtiofs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </filesystem>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tpm supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-tis</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-crb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emulator</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>external</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendVersion'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>2.0</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </tpm>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <redirdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </redirdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <channel supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </channel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <crypto supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </crypto>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <interface supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>passt</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </interface>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <panic supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>isa</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>hyperv</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </panic>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <console supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>null</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dev</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pipe</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stdio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>udp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tcp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu-vdagent</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </console>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <gic supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <vmcoreinfo supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <genid supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backingStoreInput supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backup supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <async-teardown supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <s390-pv supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <ps2 supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tdx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sev supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sgx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hyperv supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='features'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>relaxed</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vapic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>spinlocks</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vpindex</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>runtime</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>synic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stimer</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reset</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vendor_id</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>frequencies</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reenlightenment</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tlbflush</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ipi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>avic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emsr_bitmap</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>xmm_input</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <spinlocks>4095</spinlocks>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <stimer_direct>on</stimer_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hyperv>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <launchSecurity supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </features>
Jan 22 22:08:54 compute-0 nova_compute[181809]: </domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.815 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.821 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 22:08:54 compute-0 nova_compute[181809]: <domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <domain>kvm</domain>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <arch>x86_64</arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <vcpu max='4096'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <iothreads supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <os supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='firmware'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>efi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <loader supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>rom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pflash</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='readonly'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>yes</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='secure'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>yes</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </loader>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </os>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='maximum' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='maximumMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-model' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <vendor>AMD</vendor>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='x2apic'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='stibp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='succor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lbrv'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='custom' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Denverton-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Dhyana-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 sudo[182485]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='EPYC-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Haswell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='KnightsMill-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='athlon-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='core2duo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='coreduo-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='n270-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='phenom-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <memoryBacking supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='sourceType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>anonymous</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>memfd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </memoryBacking>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <disk supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='diskDevice'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>disk</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cdrom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>floppy</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>lun</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>fdc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>sata</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </disk>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <graphics supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vnc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egl-headless</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </graphics>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <video supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='modelType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vga</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>cirrus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>none</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>bochs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ramfb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </video>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hostdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='mode'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>subsystem</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='startupPolicy'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>mandatory</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>requisite</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>optional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='subsysType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pci</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='capsType'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='pciBackend'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hostdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <rng supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>random</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>egd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </rng>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <filesystem supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='driverType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>path</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>handle</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>virtiofs</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </filesystem>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tpm supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-tis</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tpm-crb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emulator</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>external</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendVersion'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>2.0</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </tpm>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <redirdev supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </redirdev>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <channel supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </channel>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <crypto supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </crypto>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <interface supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='backendType'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>passt</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </interface>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <panic supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>isa</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>hyperv</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </panic>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <console supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>null</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vc</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dev</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>file</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pipe</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stdio</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>udp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tcp</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>qemu-vdagent</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </console>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </devices>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <features>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <gic supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <vmcoreinfo supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <genid supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backingStoreInput supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <backup supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <async-teardown supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <s390-pv supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <ps2 supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <tdx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sev supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <sgx supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <hyperv supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='features'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>relaxed</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vapic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>spinlocks</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vpindex</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>runtime</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>synic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>stimer</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reset</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>vendor_id</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>frequencies</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>reenlightenment</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>tlbflush</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>ipi</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>avic</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>emsr_bitmap</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>xmm_input</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <spinlocks>4095</spinlocks>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <stimer_direct>on</stimer_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </defaults>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </hyperv>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <launchSecurity supported='no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </features>
Jan 22 22:08:54 compute-0 nova_compute[181809]: </domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:08:54 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.910 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 22:08:54 compute-0 nova_compute[181809]: <domainCapabilities>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <domain>kvm</domain>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <arch>x86_64</arch>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <vcpu max='240'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <iothreads supported='yes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <os supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <enum name='firmware'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <loader supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>rom</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>pflash</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='readonly'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>yes</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='secure'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>no</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </loader>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   </os>
Jan 22 22:08:54 compute-0 nova_compute[181809]:   <cpu>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='maximum' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <enum name='maximumMigratable'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>on</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <value>off</value>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='host-model' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <vendor>AMD</vendor>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='x2apic'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='stibp'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='succor'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='ibrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lbrv'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:54 compute-0 nova_compute[181809]:     <mode name='custom' supported='yes'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Broadwell-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bhi-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ddpd-u'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sha512'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm3'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='sm4'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v1'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:08:54 compute-0 nova_compute[181809]:       <blockers model='Cooperlake-v2'>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:54 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Denverton'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Denverton-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Denverton-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Denverton-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Dhyana-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amd-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='auto-ibrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibpb-brtype'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='no-nested-data-bp'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='null-sel-clr-base'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='perfmon-v2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbpb'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='stibp-always-on'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='EPYC-v5'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-128'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-256'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx10-512'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='prefetchiti'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Haswell-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='IvyBridge'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='IvyBridge-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='KnightsMill'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='KnightsMill-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-4fmaps'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-4vnniw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512er'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512pf'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fma4'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tbm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xop'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='amx-tile'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-bf16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-fp16'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bitalg'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vbmi2'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrc'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fzrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='la57'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='taa-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='tsx-ldtrk'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SierraForest'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='SierraForest-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ifma'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-ne-convert'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx-vnni-int8'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bhi-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='bus-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cmpccxadd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fbsdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='fsrs'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ibrs-all'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='intel-psfd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ipred-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='lam'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mcdt-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pbrsb-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='psdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rrsba-ctrl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='serialize'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vaes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='vpclmulqdq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='hle'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='rtm'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512bw'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512cd'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512dq'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512f'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='avx512vl'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='invpcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pcid'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='pku'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Snowridge'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='mpx'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v2'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v3'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='core-capability'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='split-lock-detect'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='Snowridge-v4'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='cldemote'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='erms'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='gfni'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdir64b'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='movdiri'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='xsaves'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='athlon'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='athlon-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='core2duo'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='core2duo-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='coreduo'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='coreduo-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='n270'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='n270-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='ss'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='phenom'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <blockers model='phenom-v1'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnow'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <feature name='3dnowext'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </blockers>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </mode>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   </cpu>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   <memoryBacking supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <enum name='sourceType'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <value>file</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <value>anonymous</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <value>memfd</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   </memoryBacking>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   <devices>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <disk supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='diskDevice'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>disk</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>cdrom</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>floppy</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>lun</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>ide</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>fdc</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>sata</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </disk>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <graphics supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vnc</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>egl-headless</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </graphics>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <video supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='modelType'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vga</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>cirrus</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>none</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>bochs</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>ramfb</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </video>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <hostdev supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='mode'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>subsystem</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='startupPolicy'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>mandatory</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>requisite</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>optional</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='subsysType'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>pci</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>scsi</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='capsType'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='pciBackend'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </hostdev>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <rng supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio-transitional</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtio-non-transitional</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>random</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>egd</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </rng>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <filesystem supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='driverType'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>path</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>handle</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>virtiofs</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </filesystem>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <tpm supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>tpm-tis</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>tpm-crb</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>emulator</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>external</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='backendVersion'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>2.0</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </tpm>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <redirdev supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='bus'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>usb</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </redirdev>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <channel supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </channel>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <crypto supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='model'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>qemu</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='backendModel'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>builtin</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </crypto>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <interface supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='backendType'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>default</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>passt</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </interface>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <panic supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='model'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>isa</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>hyperv</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </panic>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <console supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='type'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>null</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vc</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>pty</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>dev</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>file</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>pipe</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>stdio</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>udp</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>tcp</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>unix</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>qemu-vdagent</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>dbus</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </console>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   </devices>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   <features>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <gic supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <vmcoreinfo supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <genid supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <backingStoreInput supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <backup supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <async-teardown supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <s390-pv supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <ps2 supported='yes'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <tdx supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <sev supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <sgx supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <hyperv supported='yes'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <enum name='features'>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>relaxed</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vapic</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>spinlocks</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vpindex</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>runtime</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>synic</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>stimer</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>reset</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>vendor_id</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>frequencies</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>reenlightenment</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>tlbflush</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>ipi</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>avic</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>emsr_bitmap</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <value>xmm_input</value>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </enum>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       <defaults>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <spinlocks>4095</spinlocks>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <stimer_direct>on</stimer_direct>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:08:55 compute-0 nova_compute[181809]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:08:55 compute-0 nova_compute[181809]:       </defaults>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     </hyperv>
Jan 22 22:08:55 compute-0 nova_compute[181809]:     <launchSecurity supported='no'/>
Jan 22 22:08:55 compute-0 nova_compute[181809]:   </features>
Jan 22 22:08:55 compute-0 nova_compute[181809]: </domainCapabilities>
Jan 22 22:08:55 compute-0 nova_compute[181809]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.991 181813 DEBUG nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.992 181813 INFO nova.virt.libvirt.host [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Secure Boot support detected
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.994 181813 INFO nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:54.994 181813 INFO nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.004 181813 DEBUG nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] cpu compare xml: <cpu match="exact">
Jan 22 22:08:55 compute-0 nova_compute[181809]:   <model>Nehalem</model>
Jan 22 22:08:55 compute-0 nova_compute[181809]: </cpu>
Jan 22 22:08:55 compute-0 nova_compute[181809]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.007 181813 DEBUG nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.029 181813 INFO nova.virt.node [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Determined node identity 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from /var/lib/nova/compute_id
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.047 181813 WARNING nova.compute.manager [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Compute nodes ['4f7db789-7f4b-4901-9c88-ecf66d0aff43'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.074 181813 INFO nova.compute.manager [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.214 181813 WARNING nova.compute.manager [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.214 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.214 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.214 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.215 181813 DEBUG nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:08:55 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.402 181813 WARNING nova.virt.libvirt.driver [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.404 181813 DEBUG nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6206MB free_disk=73.58867263793945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.404 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.404 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.426 181813 WARNING nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] No compute node record for compute-0.ctlplane.example.com:4f7db789-7f4b-4901-9c88-ecf66d0aff43: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 4f7db789-7f4b-4901-9c88-ecf66d0aff43 could not be found.
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.452 181813 INFO nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 4f7db789-7f4b-4901-9c88-ecf66d0aff43
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.499 181813 DEBUG nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.500 181813 DEBUG nova.compute.resource_tracker [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:08:55 compute-0 sudo[182662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrwpqnnolnlhrbdyborgkxscrmrmojbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119735.2618124-3575-267540836299912/AnsiballZ_systemd.py'
Jan 22 22:08:55 compute-0 sudo[182662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:55 compute-0 python3.9[182664]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 22:08:55 compute-0 systemd[1]: Stopping nova_compute container...
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.927 181813 DEBUG oslo_concurrency.lockutils [None req-4e177375-9f17-4138-a2d1-659d2621fd5a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.928 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.928 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:08:55 compute-0 nova_compute[181809]: 2026-01-22 22:08:55.928 181813 DEBUG oslo_concurrency.lockutils [None req-fad719be-6d81-427a-b411-13a2a1245061 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:08:56 compute-0 virtqemud[182297]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 22:08:56 compute-0 virtqemud[182297]: hostname: compute-0
Jan 22 22:08:56 compute-0 virtqemud[182297]: End of file while reading data: Input/output error
Jan 22 22:08:56 compute-0 systemd[1]: libpod-925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0.scope: Deactivated successfully.
Jan 22 22:08:56 compute-0 systemd[1]: libpod-925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0.scope: Consumed 3.390s CPU time.
Jan 22 22:08:56 compute-0 podman[182668]: 2026-01-22 22:08:56.448529847 +0000 UTC m=+0.564486336 container died 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:08:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0-userdata-shm.mount: Deactivated successfully.
Jan 22 22:08:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150-merged.mount: Deactivated successfully.
Jan 22 22:08:57 compute-0 podman[182668]: 2026-01-22 22:08:57.416049649 +0000 UTC m=+1.532006138 container cleanup 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 22:08:57 compute-0 podman[182668]: nova_compute
Jan 22 22:08:57 compute-0 podman[182697]: nova_compute
Jan 22 22:08:57 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 22 22:08:57 compute-0 systemd[1]: Stopped nova_compute container.
Jan 22 22:08:57 compute-0 systemd[1]: Starting nova_compute container...
Jan 22 22:08:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:08:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90067a44650122ada9073050452f6cbdce27284b61c54f66f30d321622166150/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:57 compute-0 podman[182710]: 2026-01-22 22:08:57.595415144 +0000 UTC m=+0.091054063 container init 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute)
Jan 22 22:08:57 compute-0 podman[182710]: 2026-01-22 22:08:57.602240858 +0000 UTC m=+0.097879737 container start 925c076abf2465b6aa4f19daabafc64fc02e11873085414bd6ad3cd10030e5d0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:08:57 compute-0 podman[182710]: nova_compute
Jan 22 22:08:57 compute-0 nova_compute[182725]: + sudo -E kolla_set_configs
Jan 22 22:08:57 compute-0 systemd[1]: Started nova_compute container.
Jan 22 22:08:57 compute-0 sudo[182662]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Validating config file
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying service configuration files
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /etc/ceph
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Creating directory /etc/ceph
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Writing out command to execute
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:57 compute-0 nova_compute[182725]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 22:08:57 compute-0 nova_compute[182725]: ++ cat /run_command
Jan 22 22:08:57 compute-0 nova_compute[182725]: + CMD=nova-compute
Jan 22 22:08:57 compute-0 nova_compute[182725]: + ARGS=
Jan 22 22:08:57 compute-0 nova_compute[182725]: + sudo kolla_copy_cacerts
Jan 22 22:08:57 compute-0 nova_compute[182725]: + [[ ! -n '' ]]
Jan 22 22:08:57 compute-0 nova_compute[182725]: + . kolla_extend_start
Jan 22 22:08:57 compute-0 nova_compute[182725]: Running command: 'nova-compute'
Jan 22 22:08:57 compute-0 nova_compute[182725]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 22:08:57 compute-0 nova_compute[182725]: + umask 0022
Jan 22 22:08:57 compute-0 nova_compute[182725]: + exec nova-compute
Jan 22 22:08:58 compute-0 sudo[182886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiphaooorezbwpahyzxhdpjfapmxpgxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119737.9009862-3602-15701616304110/AnsiballZ_podman_container.py'
Jan 22 22:08:58 compute-0 sudo[182886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:08:58 compute-0 python3.9[182888]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 22:08:58 compute-0 systemd[1]: Started libpod-conmon-d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5.scope.
Jan 22 22:08:58 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:08:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3708a51cbb22935e536c75ed23db96be9688754aed3201902c3c5eb141c808f9/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3708a51cbb22935e536c75ed23db96be9688754aed3201902c3c5eb141c808f9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3708a51cbb22935e536c75ed23db96be9688754aed3201902c3c5eb141c808f9/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 22 22:08:58 compute-0 podman[182913]: 2026-01-22 22:08:58.775084338 +0000 UTC m=+0.145921155 container init d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:08:58 compute-0 podman[182913]: 2026-01-22 22:08:58.782401265 +0000 UTC m=+0.153238062 container start d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:08:58 compute-0 python3.9[182888]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Applying nova statedir ownership
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 22 22:08:58 compute-0 nova_compute_init[182932]: INFO:nova_statedir:Nova statedir ownership complete
Jan 22 22:08:58 compute-0 systemd[1]: libpod-d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5.scope: Deactivated successfully.
Jan 22 22:08:58 compute-0 podman[182947]: 2026-01-22 22:08:58.87938756 +0000 UTC m=+0.029284299 container died d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=nova_compute_init)
Jan 22 22:08:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5-userdata-shm.mount: Deactivated successfully.
Jan 22 22:08:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-3708a51cbb22935e536c75ed23db96be9688754aed3201902c3c5eb141c808f9-merged.mount: Deactivated successfully.
Jan 22 22:08:58 compute-0 podman[182947]: 2026-01-22 22:08:58.912383202 +0000 UTC m=+0.062279911 container cleanup d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:08:58 compute-0 systemd[1]: libpod-conmon-d67d929923ac9a352b4302d5cd5ec2bf4362fd213c610d4d5781c0191e5ad2f5.scope: Deactivated successfully.
Jan 22 22:08:58 compute-0 sudo[182886]: pam_unix(sudo:session): session closed for user root
Jan 22 22:08:59 compute-0 sshd-session[159624]: Connection closed by 192.168.122.30 port 53422
Jan 22 22:08:59 compute-0 sshd-session[159621]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:08:59 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 22 22:08:59 compute-0 systemd[1]: session-23.scope: Consumed 1min 47.057s CPU time.
Jan 22 22:08:59 compute-0 systemd-logind[801]: Session 23 logged out. Waiting for processes to exit.
Jan 22 22:08:59 compute-0 systemd-logind[801]: Removed session 23.
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.719 182729 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.719 182729 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.719 182729 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.720 182729 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.866 182729 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.877 182729 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:08:59 compute-0 nova_compute[182725]: 2026-01-22 22:08:59.878 182729 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.335 182729 INFO nova.virt.driver [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.448 182729 INFO nova.compute.provider_config [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.455 182729 DEBUG oslo_concurrency.lockutils [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.456 182729 DEBUG oslo_concurrency.lockutils [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.456 182729 DEBUG oslo_concurrency.lockutils [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.456 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.457 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.457 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.457 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.457 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.457 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.458 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.459 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.460 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.460 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.460 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.460 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.460 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.461 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.461 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.461 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.461 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.462 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.462 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.462 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.462 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.462 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.463 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.463 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.463 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.463 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.463 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.464 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.464 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.464 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.464 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.464 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.465 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.465 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.465 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.465 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.465 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.466 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.466 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.466 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.466 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.466 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.467 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.468 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.469 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.470 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.471 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.472 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.473 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.474 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.475 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.476 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.476 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.476 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.476 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.476 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.477 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.477 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.477 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.477 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.477 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.478 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.478 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.478 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.478 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.478 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.479 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.479 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.479 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.479 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.479 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.480 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.481 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.482 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.482 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.482 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.482 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.482 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.483 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.483 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.483 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.483 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.483 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.484 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.484 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.484 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.484 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.484 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.485 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.485 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.485 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.485 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.485 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.486 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.486 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.486 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.486 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.486 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.487 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.487 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.487 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.487 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.487 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.488 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.488 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.488 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.488 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.488 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.489 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.489 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.489 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.489 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.489 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.490 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.491 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.491 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.491 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.491 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.491 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.492 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.493 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.494 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.495 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.496 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.497 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.498 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.499 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.500 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.500 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.500 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.500 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.500 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.501 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.502 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.503 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.504 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.505 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.506 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.507 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.508 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.509 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.510 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.510 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.510 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.510 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.510 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.511 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.512 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.513 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.514 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.515 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.515 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.515 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.515 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.516 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.517 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.518 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.518 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.518 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.518 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.518 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.519 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.520 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.521 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.522 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.523 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.524 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.525 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.526 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.527 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.528 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.529 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.530 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.531 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.532 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.533 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.534 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.534 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.534 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.534 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.534 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.535 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.536 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.537 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.537 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.537 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.537 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.538 182729 WARNING oslo_config.cfg [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 22:09:00 compute-0 nova_compute[182725]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 22:09:00 compute-0 nova_compute[182725]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 22:09:00 compute-0 nova_compute[182725]: and ``live_migration_inbound_addr`` respectively.
Jan 22 22:09:00 compute-0 nova_compute[182725]: ).  Its value may be silently ignored in the future.
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.538 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.538 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.538 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.538 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.539 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.539 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.539 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.539 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.539 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.540 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.540 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.540 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.540 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.540 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.541 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.541 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.541 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.541 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.541 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.542 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.543 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.543 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.543 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.543 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.543 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.544 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.545 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.546 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.546 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.546 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.546 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.546 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.547 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.547 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.547 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.547 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.547 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.548 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.548 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.548 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.548 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.548 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.549 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.549 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.549 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.549 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.549 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.550 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.551 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.551 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.551 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.551 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.551 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.552 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.553 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.553 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.553 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.553 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.553 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.554 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.554 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.554 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.554 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.555 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.556 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.557 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.558 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.559 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.560 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.560 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.560 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.560 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.560 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.561 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.561 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.561 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.561 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.561 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.562 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.562 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.562 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.562 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.562 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.563 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.563 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.563 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.563 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.563 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.564 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.565 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.565 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.565 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.565 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.565 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.566 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.567 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.568 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.568 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.568 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.568 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.568 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.569 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.569 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.569 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.569 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.569 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.570 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.570 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.570 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.570 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.570 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.571 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.571 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.571 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.571 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.571 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.572 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.572 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.572 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.572 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.572 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.573 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.573 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.573 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.573 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.573 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.574 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.574 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.574 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.574 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.575 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.575 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.575 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.575 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.575 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.576 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.577 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.577 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.577 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.577 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.577 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.578 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.579 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.580 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.581 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.582 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.583 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.583 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.583 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.583 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.583 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.584 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.584 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.584 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.584 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.585 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.586 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.586 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.586 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.586 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.586 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.587 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.588 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.588 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.588 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.588 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.588 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.589 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.589 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.589 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.589 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.589 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.590 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.590 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.590 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.590 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.590 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.591 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.591 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.591 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.591 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.591 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.592 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.593 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.593 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.593 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.593 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.593 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.594 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.594 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.594 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.594 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.594 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.595 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.595 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.595 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.595 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.595 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.596 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.597 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.598 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.599 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.600 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.601 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.602 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.603 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.604 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.605 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.606 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.607 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.608 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.609 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.610 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.611 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.612 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.612 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.612 182729 DEBUG oslo_service.service [None req-3c64270b-b1eb-4156-9fce-f9c61c0341e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.613 182729 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.632 182729 INFO nova.virt.node [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Determined node identity 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from /var/lib/nova/compute_id
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.633 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.634 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.634 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.634 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.648 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f6dbd8bcd60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.652 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f6dbd8bcd60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.653 182729 INFO nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Connection event '1' reason 'None'
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.659 182729 INFO nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]: 
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <host>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <uuid>148e2083-b3dc-4db0-b189-a79547a2be98</uuid>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <arch>x86_64</arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model>EPYC-Rome-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <vendor>AMD</vendor>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <microcode version='16777317'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <signature family='23' model='49' stepping='0'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='x2apic'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='tsc-deadline'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='osxsave'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='hypervisor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='tsc_adjust'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='spec-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='stibp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='arch-capabilities'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='cmp_legacy'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='topoext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='virt-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='lbrv'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='tsc-scale'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='vmcb-clean'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='pause-filter'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='pfthreshold'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='svme-addr-chk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='rdctl-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='skip-l1dfl-vmentry'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='mds-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature name='pschange-mc-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <pages unit='KiB' size='4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <pages unit='KiB' size='2048'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <pages unit='KiB' size='1048576'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <power_management>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <suspend_mem/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <suspend_disk/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <suspend_hybrid/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </power_management>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <iommu support='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <migration_features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <live/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <uri_transports>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <uri_transport>tcp</uri_transport>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <uri_transport>rdma</uri_transport>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </uri_transports>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </migration_features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <topology>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <cells num='1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <cell id='0'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <memory unit='KiB'>7864316</memory>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <pages unit='KiB' size='2048'>0</pages>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <distances>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <sibling id='0' value='10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           </distances>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           <cpus num='8'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:           </cpus>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         </cell>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </cells>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </topology>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <cache>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </cache>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <secmodel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model>selinux</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <doi>0</doi>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </secmodel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <secmodel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model>dac</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <doi>0</doi>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </secmodel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </host>
Jan 22 22:09:00 compute-0 nova_compute[182725]: 
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <guest>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <os_type>hvm</os_type>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <arch name='i686'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <wordsize>32</wordsize>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <domain type='qemu'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <domain type='kvm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <pae/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <nonpae/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <acpi default='on' toggle='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <apic default='on' toggle='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <cpuselection/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <deviceboot/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <disksnapshot default='on' toggle='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <externalSnapshot/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </guest>
Jan 22 22:09:00 compute-0 nova_compute[182725]: 
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <guest>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <os_type>hvm</os_type>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <arch name='x86_64'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <wordsize>64</wordsize>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <domain type='qemu'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <domain type='kvm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <acpi default='on' toggle='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <apic default='on' toggle='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <cpuselection/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <deviceboot/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <disksnapshot default='on' toggle='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <externalSnapshot/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </guest>
Jan 22 22:09:00 compute-0 nova_compute[182725]: 
Jan 22 22:09:00 compute-0 nova_compute[182725]: </capabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]: 
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.665 182729 DEBUG nova.virt.libvirt.volume.mount [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.667 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.671 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 22:09:00 compute-0 nova_compute[182725]: <domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <domain>kvm</domain>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <arch>i686</arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <vcpu max='4096'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <iothreads supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <os supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='firmware'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <loader supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>rom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pflash</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='readonly'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>yes</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='secure'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </loader>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </os>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='maximum' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='maximumMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-model' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <vendor>AMD</vendor>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='x2apic'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='stibp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='succor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lbrv'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='custom' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Dhyana-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <memoryBacking supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='sourceType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>anonymous</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>memfd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </memoryBacking>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <disk supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='diskDevice'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>disk</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cdrom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>floppy</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>lun</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>fdc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>sata</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <graphics supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vnc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egl-headless</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <video supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='modelType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vga</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cirrus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>none</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>bochs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ramfb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </video>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hostdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='mode'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>subsystem</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='startupPolicy'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>mandatory</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>requisite</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>optional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='subsysType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pci</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='capsType'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='pciBackend'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hostdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <rng supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>random</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <filesystem supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='driverType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>path</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>handle</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtiofs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </filesystem>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tpm supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-tis</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-crb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emulator</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>external</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendVersion'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>2.0</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </tpm>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <redirdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </redirdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <channel supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </channel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <crypto supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </crypto>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <interface supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>passt</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <panic supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>isa</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>hyperv</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </panic>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <console supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>null</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dev</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pipe</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stdio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>udp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tcp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu-vdagent</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </console>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <gic supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <vmcoreinfo supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <genid supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backingStoreInput supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backup supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <async-teardown supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <s390-pv supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <ps2 supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tdx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sev supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sgx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hyperv supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='features'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>relaxed</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vapic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>spinlocks</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vpindex</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>runtime</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>synic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stimer</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reset</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vendor_id</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>frequencies</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reenlightenment</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tlbflush</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ipi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>avic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emsr_bitmap</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>xmm_input</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <spinlocks>4095</spinlocks>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <stimer_direct>on</stimer_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hyperv>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <launchSecurity supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </features>
Jan 22 22:09:00 compute-0 nova_compute[182725]: </domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.678 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 22:09:00 compute-0 nova_compute[182725]: <domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <domain>kvm</domain>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <arch>i686</arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <vcpu max='240'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <iothreads supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <os supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='firmware'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <loader supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>rom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pflash</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='readonly'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>yes</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='secure'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </loader>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </os>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='maximum' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='maximumMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-model' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <vendor>AMD</vendor>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='x2apic'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='stibp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='succor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lbrv'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='custom' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Dhyana-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <memoryBacking supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='sourceType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>anonymous</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>memfd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </memoryBacking>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <disk supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='diskDevice'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>disk</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cdrom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>floppy</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>lun</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ide</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>fdc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>sata</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <graphics supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vnc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egl-headless</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <video supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='modelType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vga</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cirrus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>none</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>bochs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ramfb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </video>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hostdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='mode'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>subsystem</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='startupPolicy'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>mandatory</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>requisite</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>optional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='subsysType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pci</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='capsType'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='pciBackend'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hostdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <rng supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>random</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <filesystem supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='driverType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>path</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>handle</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtiofs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </filesystem>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tpm supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-tis</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-crb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emulator</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>external</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendVersion'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>2.0</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </tpm>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <redirdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </redirdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <channel supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </channel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <crypto supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </crypto>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <interface supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>passt</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <panic supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>isa</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>hyperv</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </panic>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <console supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>null</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dev</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pipe</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stdio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>udp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tcp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu-vdagent</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </console>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <gic supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <vmcoreinfo supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <genid supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backingStoreInput supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backup supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <async-teardown supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <s390-pv supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <ps2 supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tdx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sev supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sgx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hyperv supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='features'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>relaxed</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vapic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>spinlocks</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vpindex</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>runtime</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>synic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stimer</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reset</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vendor_id</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>frequencies</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reenlightenment</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tlbflush</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ipi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>avic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emsr_bitmap</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>xmm_input</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <spinlocks>4095</spinlocks>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <stimer_direct>on</stimer_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hyperv>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <launchSecurity supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </features>
Jan 22 22:09:00 compute-0 nova_compute[182725]: </domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.770 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.781 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 22:09:00 compute-0 nova_compute[182725]: <domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <domain>kvm</domain>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <arch>x86_64</arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <vcpu max='4096'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <iothreads supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <os supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='firmware'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>efi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <loader supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>rom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pflash</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='readonly'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>yes</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='secure'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>yes</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </loader>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </os>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='maximum' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='maximumMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-model' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <vendor>AMD</vendor>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='x2apic'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='stibp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='succor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lbrv'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='custom' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Dhyana-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='athlon-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='core2duo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='coreduo-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='n270-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='phenom-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <memoryBacking supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='sourceType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>anonymous</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>memfd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </memoryBacking>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <disk supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='diskDevice'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>disk</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cdrom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>floppy</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>lun</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>fdc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>sata</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <graphics supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vnc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egl-headless</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <video supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='modelType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vga</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>cirrus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>none</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>bochs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ramfb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </video>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hostdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='mode'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>subsystem</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='startupPolicy'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>mandatory</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>requisite</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>optional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='subsysType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pci</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='capsType'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='pciBackend'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hostdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <rng supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>random</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>egd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <filesystem supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='driverType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>path</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>handle</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>virtiofs</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </filesystem>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tpm supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-tis</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tpm-crb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emulator</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>external</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendVersion'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>2.0</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </tpm>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <redirdev supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </redirdev>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <channel supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </channel>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <crypto supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </crypto>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <interface supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='backendType'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>passt</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <panic supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>isa</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>hyperv</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </panic>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <console supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>null</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vc</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dev</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>file</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pipe</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stdio</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>udp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tcp</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>qemu-vdagent</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </console>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <features>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <gic supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <vmcoreinfo supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <genid supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backingStoreInput supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <backup supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <async-teardown supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <s390-pv supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <ps2 supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <tdx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sev supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <sgx supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <hyperv supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='features'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>relaxed</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vapic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>spinlocks</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vpindex</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>runtime</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>synic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>stimer</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reset</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>vendor_id</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>frequencies</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>reenlightenment</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>tlbflush</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>ipi</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>avic</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>emsr_bitmap</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>xmm_input</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <spinlocks>4095</spinlocks>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <stimer_direct>on</stimer_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </defaults>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </hyperv>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <launchSecurity supported='no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </features>
Jan 22 22:09:00 compute-0 nova_compute[182725]: </domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:09:00 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.862 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 22:09:00 compute-0 nova_compute[182725]: <domainCapabilities>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <path>/usr/libexec/qemu-kvm</path>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <domain>kvm</domain>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <arch>x86_64</arch>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <vcpu max='240'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <iothreads supported='yes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <os supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <enum name='firmware'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <loader supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>rom</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>pflash</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='readonly'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>yes</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='secure'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>no</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </loader>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   </os>
Jan 22 22:09:00 compute-0 nova_compute[182725]:   <cpu>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-passthrough' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='hostPassthroughMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='maximum' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <enum name='maximumMigratable'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>on</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <value>off</value>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='host-model' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <vendor>AMD</vendor>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='x2apic'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-deadline'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='hypervisor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc_adjust'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='spec-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='stibp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='cmp_legacy'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='overflow-recov'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='succor'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='amd-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='virt-ssbd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lbrv'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='tsc-scale'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='vmcb-clean'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='flushbyasid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pause-filter'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='pfthreshold'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='svme-addr-chk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <feature policy='disable' name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:00 compute-0 nova_compute[182725]:     <mode name='custom' supported='yes'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Broadwell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cascadelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='ClearwaterForest-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bhi-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ddpd-u'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sha512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm3'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sm4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Cooperlake-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Denverton-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Dhyana-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Genoa-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Milan-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Rome-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-Turin-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amd-psfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='auto-ibrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vp2intersect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fs-gs-base-ns'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibpb-brtype'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='no-nested-data-bp'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='null-sel-clr-base'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='perfmon-v2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbpb'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='srso-user-kernel-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='stibp-always-on'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='EPYC-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='GraniteRapids-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-128'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-256'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx10-512'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='prefetchiti'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Haswell-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-noTSX'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v3'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v6'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Icelake-Server-v7'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-IBRS'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='IvyBridge-v2'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='KnightsMill-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4fmaps'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-4vnniw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512er'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512pf'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G4-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='Opteron_G5-v1'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fma4'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tbm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xop'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 22:09:00 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids'>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:00 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v2'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v3'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SapphireRapids-v4'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='amx-tile'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-bf16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-fp16'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512-vpopcntdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bitalg'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vbmi2'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrc'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fzrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='la57'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='taa-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='tsx-ldtrk'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SierraForest'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v2'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='SierraForest-v3'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ifma'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-ne-convert'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx-vnni-int8'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bhi-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='bus-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cmpccxadd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fbsdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='fsrs'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ibrs-all'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='intel-psfd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ipred-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='lam'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mcdt-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pbrsb-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='psdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rrsba-ctrl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='sbdr-ssdp-no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='serialize'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vaes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='vpclmulqdq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-IBRS'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v2'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v3'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Client-v4'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-IBRS'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v2'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='hle'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='rtm'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v3'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v4'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Skylake-Server-v5'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512bw'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512cd'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512dq'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512f'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='avx512vl'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='invpcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pcid'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='pku'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Snowridge'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='mpx'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v2'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v3'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='core-capability'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='split-lock-detect'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='Snowridge-v4'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='cldemote'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='erms'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='gfni'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdir64b'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='movdiri'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='xsaves'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='athlon'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='athlon-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='core2duo'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='core2duo-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='coreduo'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='coreduo-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='n270'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='n270-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='ss'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='phenom'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <blockers model='phenom-v1'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnow'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <feature name='3dnowext'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </blockers>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </mode>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <memoryBacking supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <enum name='sourceType'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <value>file</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <value>anonymous</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <value>memfd</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   </memoryBacking>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <disk supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='diskDevice'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>disk</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>cdrom</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>floppy</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>lun</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>ide</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>fdc</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>sata</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <graphics supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vnc</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>egl-headless</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <video supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='modelType'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vga</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>cirrus</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>none</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>bochs</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>ramfb</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </video>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <hostdev supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='mode'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>subsystem</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='startupPolicy'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>mandatory</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>requisite</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>optional</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='subsysType'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>pci</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>scsi</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='capsType'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='pciBackend'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </hostdev>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <rng supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio-transitional</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtio-non-transitional</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>random</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>egd</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <filesystem supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='driverType'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>path</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>handle</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>virtiofs</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </filesystem>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <tpm supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>tpm-tis</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>tpm-crb</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>emulator</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>external</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='backendVersion'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>2.0</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </tpm>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <redirdev supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='bus'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>usb</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </redirdev>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <channel supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </channel>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <crypto supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='model'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>qemu</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='backendModel'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>builtin</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </crypto>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <interface supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='backendType'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>default</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>passt</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <panic supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='model'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>isa</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>hyperv</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </panic>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <console supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='type'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>null</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vc</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>pty</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>dev</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>file</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>pipe</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>stdio</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>udp</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>tcp</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>unix</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>qemu-vdagent</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>dbus</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </console>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <features>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <gic supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <vmcoreinfo supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <genid supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <backingStoreInput supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <backup supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <async-teardown supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <s390-pv supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <ps2 supported='yes'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <tdx supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <sev supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <sgx supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <hyperv supported='yes'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <enum name='features'>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>relaxed</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vapic</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>spinlocks</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vpindex</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>runtime</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>synic</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>stimer</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>reset</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>vendor_id</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>frequencies</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>reenlightenment</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>tlbflush</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>ipi</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>avic</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>emsr_bitmap</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <value>xmm_input</value>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </enum>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       <defaults>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <spinlocks>4095</spinlocks>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <stimer_direct>on</stimer_direct>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <tlbflush_direct>on</tlbflush_direct>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <tlbflush_extended>on</tlbflush_extended>
Jan 22 22:09:01 compute-0 nova_compute[182725]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 22:09:01 compute-0 nova_compute[182725]:       </defaults>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     </hyperv>
Jan 22 22:09:01 compute-0 nova_compute[182725]:     <launchSecurity supported='no'/>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   </features>
Jan 22 22:09:01 compute-0 nova_compute[182725]: </domainCapabilities>
Jan 22 22:09:01 compute-0 nova_compute[182725]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.944 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.945 182729 INFO nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Secure Boot support detected
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.948 182729 INFO nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.948 182729 INFO nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.963 182729 DEBUG nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <model>Nehalem</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]: </cpu>
Jan 22 22:09:01 compute-0 nova_compute[182725]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.967 182729 DEBUG nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:00.987 182729 INFO nova.virt.node [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Determined node identity 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from /var/lib/nova/compute_id
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.010 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Verified node 4f7db789-7f4b-4901-9c88-ecf66d0aff43 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.042 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.114 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.115 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.116 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.116 182729 DEBUG nova.compute.resource_tracker [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.285 182729 WARNING nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.286 182729 DEBUG nova.compute.resource_tracker [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6186MB free_disk=73.58792114257812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.286 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.286 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.414 182729 DEBUG nova.compute.resource_tracker [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.414 182729 DEBUG nova.compute.resource_tracker [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.490 182729 DEBUG nova.scheduler.client.report [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.509 182729 DEBUG nova.scheduler.client.report [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.509 182729 DEBUG nova.compute.provider_tree [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.533 182729 DEBUG nova.scheduler.client.report [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.551 182729 DEBUG nova.scheduler.client.report [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.571 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 22 22:09:01 compute-0 nova_compute[182725]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.571 182729 INFO nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] kernel doesn't support AMD SEV
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.572 182729 DEBUG nova.compute.provider_tree [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.572 182729 DEBUG nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.574 182729 DEBUG nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Libvirt baseline CPU <cpu>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <arch>x86_64</arch>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <model>Nehalem</model>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <vendor>AMD</vendor>
Jan 22 22:09:01 compute-0 nova_compute[182725]:   <topology sockets="8" cores="1" threads="1"/>
Jan 22 22:09:01 compute-0 nova_compute[182725]: </cpu>
Jan 22 22:09:01 compute-0 nova_compute[182725]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.639 182729 DEBUG nova.scheduler.client.report [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updated inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.640 182729 DEBUG nova.compute.provider_tree [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updating resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.640 182729 DEBUG nova.compute.provider_tree [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.720 182729 DEBUG nova.compute.provider_tree [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Updating resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.745 182729 DEBUG nova.compute.resource_tracker [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.746 182729 DEBUG oslo_concurrency.lockutils [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.746 182729 DEBUG nova.service [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.818 182729 DEBUG nova.service [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 22 22:09:01 compute-0 nova_compute[182725]: 2026-01-22 22:09:01.819 182729 DEBUG nova.servicegroup.drivers.db [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 22 22:09:04 compute-0 sshd-session[183023]: Accepted publickey for zuul from 192.168.122.30 port 59630 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 22:09:04 compute-0 systemd-logind[801]: New session 25 of user zuul.
Jan 22 22:09:04 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 22 22:09:04 compute-0 sshd-session[183023]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 22:09:05 compute-0 python3.9[183176]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 22:09:07 compute-0 sudo[183330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtcpcfnqtuvtxezfmjkuuokezoruwpke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119746.5332453-68-276590036474012/AnsiballZ_systemd_service.py'
Jan 22 22:09:07 compute-0 sudo[183330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:07 compute-0 python3.9[183332]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:09:07 compute-0 systemd[1]: Reloading.
Jan 22 22:09:07 compute-0 systemd-rc-local-generator[183359]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:09:07 compute-0 systemd-sysv-generator[183362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:09:07 compute-0 sudo[183330]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:08 compute-0 python3.9[183517]: ansible-ansible.builtin.service_facts Invoked
Jan 22 22:09:08 compute-0 network[183534]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 22:09:08 compute-0 network[183535]: 'network-scripts' will be removed from distribution in near future.
Jan 22 22:09:08 compute-0 network[183536]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 22:09:12 compute-0 podman[183632]: 2026-01-22 22:09:12.26958406 +0000 UTC m=+0.148135413 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:09:12.417 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:09:12.417 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:09:12.418 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:09:14 compute-0 sudo[183832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anexsexyfrqemlvheomuwffqdvresgxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119753.9328427-125-276867932865183/AnsiballZ_systemd_service.py'
Jan 22 22:09:14 compute-0 sudo[183832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:14 compute-0 python3.9[183834]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:09:14 compute-0 sudo[183832]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:15 compute-0 sudo[183985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvrxpzzfowirpnkraumwnbremrrasagb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119755.0567753-155-47289963368858/AnsiballZ_file.py'
Jan 22 22:09:15 compute-0 sudo[183985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:15 compute-0 python3.9[183987]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:15 compute-0 sudo[183985]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:15 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:09:15 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:09:16 compute-0 sudo[184138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwcnuwipnclgufsdozvvlqrumntrkwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119756.0022163-179-85592541610864/AnsiballZ_file.py'
Jan 22 22:09:16 compute-0 sudo[184138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:16 compute-0 python3.9[184140]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:16 compute-0 sudo[184138]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:17 compute-0 sudo[184290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipgiratgentqpteobxkpspowfucyekqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119756.8535967-206-166524692284085/AnsiballZ_command.py'
Jan 22 22:09:17 compute-0 sudo[184290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:17 compute-0 python3.9[184292]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:09:17 compute-0 sudo[184290]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:18 compute-0 python3.9[184444]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 22:09:19 compute-0 podman[184521]: 2026-01-22 22:09:19.176691308 +0000 UTC m=+0.101467317 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 22:09:19 compute-0 sudo[184611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajslzzugnrotlmcwvjvnzsipqsjklcbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119758.9592245-260-267031048679448/AnsiballZ_systemd_service.py'
Jan 22 22:09:19 compute-0 sudo[184611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:19 compute-0 python3.9[184613]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:09:19 compute-0 systemd[1]: Reloading.
Jan 22 22:09:19 compute-0 systemd-rc-local-generator[184641]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:09:19 compute-0 systemd-sysv-generator[184644]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:09:19 compute-0 sudo[184611]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:20 compute-0 sudo[184798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfxxrxzderffhoqpzhkegljrtfrybadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119760.2728791-284-152693755828143/AnsiballZ_command.py'
Jan 22 22:09:20 compute-0 sudo[184798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:20 compute-0 python3.9[184800]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:09:20 compute-0 sudo[184798]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:21 compute-0 sudo[184951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjorshybtfgqnrfqhxfdhtriyiseuboj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119761.06623-311-109606355510730/AnsiballZ_file.py'
Jan 22 22:09:21 compute-0 sudo[184951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:21 compute-0 python3.9[184953]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:21 compute-0 sudo[184951]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:22 compute-0 python3.9[185103]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:23 compute-0 sudo[185255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefjasiholygwpkfutjfmtqifxcdzwsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119762.876835-359-237184154747019/AnsiballZ_group.py'
Jan 22 22:09:23 compute-0 sudo[185255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:23 compute-0 python3.9[185257]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 22 22:09:23 compute-0 sudo[185255]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:24 compute-0 sudo[185407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhrtkamnqhpvrgtbuwefbixpwwfszdtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119764.0602667-392-76130304970778/AnsiballZ_getent.py'
Jan 22 22:09:24 compute-0 sudo[185407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:24 compute-0 python3.9[185409]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 22 22:09:24 compute-0 sudo[185407]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:25 compute-0 sudo[185560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyrhbhmwqlffjxrwqvpcrhbfhdiqjjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119764.952911-416-217716937778101/AnsiballZ_group.py'
Jan 22 22:09:25 compute-0 sudo[185560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:25 compute-0 python3.9[185562]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 22:09:25 compute-0 groupadd[185563]: group added to /etc/group: name=ceilometer, GID=42405
Jan 22 22:09:26 compute-0 groupadd[185563]: group added to /etc/gshadow: name=ceilometer
Jan 22 22:09:26 compute-0 groupadd[185563]: new group: name=ceilometer, GID=42405
Jan 22 22:09:26 compute-0 sudo[185560]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:26 compute-0 sudo[185718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvulojxmkbifychytkbdozwjidxuwdhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119766.2757509-440-23982616792185/AnsiballZ_user.py'
Jan 22 22:09:26 compute-0 sudo[185718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:27 compute-0 python3.9[185720]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 22:09:27 compute-0 useradd[185722]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 22 22:09:27 compute-0 useradd[185722]: add 'ceilometer' to group 'libvirt'
Jan 22 22:09:27 compute-0 useradd[185722]: add 'ceilometer' to shadow group 'libvirt'
Jan 22 22:09:28 compute-0 sudo[185718]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:29 compute-0 python3.9[185878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:30 compute-0 python3.9[185999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119769.0068662-518-249087935399437/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:30 compute-0 python3.9[186149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:31 compute-0 python3.9[186270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119770.3156984-518-150248742978144/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:31 compute-0 python3.9[186420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:32 compute-0 python3.9[186541]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119771.4654253-518-28509999116447/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:33 compute-0 python3.9[186691]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:34 compute-0 python3.9[186843]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:34 compute-0 python3.9[186995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:35 compute-0 python3.9[187116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119774.4573953-695-218998453680717/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:36 compute-0 python3.9[187266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:36 compute-0 python3.9[187387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119775.6833155-695-221110548228113/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:37 compute-0 python3.9[187537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:38 compute-0 python3.9[187658]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119777.057136-782-136017950448193/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:39 compute-0 python3.9[187808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:40 compute-0 python3.9[187929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119778.750869-830-35835873945894/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:40 compute-0 python3.9[188079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:41 compute-0 python3.9[188200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119780.3352442-875-6819278896900/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:42 compute-0 python3.9[188350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:42 compute-0 podman[188445]: 2026-01-22 22:09:42.84022055 +0000 UTC m=+0.124866411 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:09:42 compute-0 python3.9[188484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119781.7914293-920-3963581085751/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:43 compute-0 sudo[188650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgxrbdogrcwbysficlibpdlezayyoyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119783.225806-965-209865507975884/AnsiballZ_file.py'
Jan 22 22:09:43 compute-0 sudo[188650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:43 compute-0 python3.9[188652]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:43 compute-0 sudo[188650]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:44 compute-0 sudo[188802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnegdfzbzikzuawkdbppeaziluayldsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119783.9461706-989-19812598740140/AnsiballZ_file.py'
Jan 22 22:09:44 compute-0 sudo[188802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:44 compute-0 python3.9[188804]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:44 compute-0 sudo[188802]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:45 compute-0 python3.9[188954]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:46 compute-0 python3.9[189106]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:46 compute-0 python3.9[189258]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:09:47 compute-0 sudo[189410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjrzepdlvmxqluytlhcpgiepxbwhuuua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119787.1272225-1085-42846599001910/AnsiballZ_file.py'
Jan 22 22:09:47 compute-0 sudo[189410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:47 compute-0 python3.9[189412]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:47 compute-0 sudo[189410]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:48 compute-0 sudo[189562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vccgzngettpzplscacokkrjoqjutpjbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119787.9742725-1109-234735685647637/AnsiballZ_systemd_service.py'
Jan 22 22:09:48 compute-0 sudo[189562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:48 compute-0 python3.9[189564]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:09:48 compute-0 systemd[1]: Reloading.
Jan 22 22:09:48 compute-0 systemd-rc-local-generator[189592]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:09:48 compute-0 systemd-sysv-generator[189596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:09:49 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 22 22:09:49 compute-0 sudo[189562]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:49 compute-0 podman[189603]: 2026-01-22 22:09:49.40227395 +0000 UTC m=+0.082611053 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 22:09:50 compute-0 sudo[189772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmjzjsmnphnfnozmqxvruxmjjkfkdlra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/AnsiballZ_stat.py'
Jan 22 22:09:50 compute-0 sudo[189772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:50 compute-0 python3.9[189774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:50 compute-0 sudo[189772]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:50 compute-0 sudo[189895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqkrlzzcarpnyrsoiopoiagubinqngqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/AnsiballZ_copy.py'
Jan 22 22:09:50 compute-0 sudo[189895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:50 compute-0 python3.9[189897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:50 compute-0 sudo[189895]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:51 compute-0 sudo[189971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhxebfcaxpvqnhzbhtstckkomokdujaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/AnsiballZ_stat.py'
Jan 22 22:09:51 compute-0 sudo[189971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:51 compute-0 python3.9[189973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:51 compute-0 sudo[189971]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:51 compute-0 sudo[190094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skmpcdcjruohenxqdovkryxoeunahdbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/AnsiballZ_copy.py'
Jan 22 22:09:51 compute-0 sudo[190094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:52 compute-0 python3.9[190096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119789.7821777-1136-125521750854505/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:52 compute-0 sudo[190094]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:52 compute-0 nova_compute[182725]: 2026-01-22 22:09:52.822 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:52 compute-0 nova_compute[182725]: 2026-01-22 22:09:52.876 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:53 compute-0 sudo[190246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wirlstuudakparopeasqfohxtzncvctp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119792.8066564-1232-220671046914332/AnsiballZ_file.py'
Jan 22 22:09:53 compute-0 sudo[190246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:53 compute-0 python3.9[190248]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:53 compute-0 sudo[190246]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:53 compute-0 sudo[190398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfbzklmcrvzwyjbyrwysfbeycdtwqkuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119793.5900571-1256-125667364312544/AnsiballZ_file.py'
Jan 22 22:09:53 compute-0 sudo[190398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:54 compute-0 python3.9[190400]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:09:54 compute-0 sudo[190398]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:54 compute-0 sudo[190550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxnbneajamddqvliwpwvpkzdmaxmaduo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119794.3279076-1280-35981648038629/AnsiballZ_stat.py'
Jan 22 22:09:54 compute-0 sudo[190550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:54 compute-0 python3.9[190552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:09:54 compute-0 sudo[190550]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:55 compute-0 sudo[190673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkggftxduhjvaubdupbvrywrdvhugsnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119794.3279076-1280-35981648038629/AnsiballZ_copy.py'
Jan 22 22:09:55 compute-0 sudo[190673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:55 compute-0 python3.9[190675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119794.3279076-1280-35981648038629/.source.json _original_basename=.upgdlk3d follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:55 compute-0 sudo[190673]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:56 compute-0 python3.9[190825]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:09:58 compute-0 sudo[191246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uukndinzcekcnxrqmaffctstxmmcqwqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119797.9887686-1400-120458588783288/AnsiballZ_container_config_data.py'
Jan 22 22:09:58 compute-0 sudo[191246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:58 compute-0 python3.9[191248]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 22 22:09:58 compute-0 sudo[191246]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:59 compute-0 sudo[191398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsllacusnjnexfuhyorcooxbhcaiaxws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119799.1145442-1433-126395718560568/AnsiballZ_container_config_hash.py'
Jan 22 22:09:59 compute-0 sudo[191398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:09:59 compute-0 python3.9[191400]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:09:59 compute-0 sudo[191398]: pam_unix(sudo:session): session closed for user root
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.907 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.907 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.907 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.909 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.939 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.939 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.940 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:09:59 compute-0 nova_compute[182725]: 2026-01-22 22:09:59.940 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.126 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.127 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6147MB free_disk=73.58576202392578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.127 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.127 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.204 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.204 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.224 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.237 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.238 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:10:00 compute-0 nova_compute[182725]: 2026-01-22 22:10:00.238 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:10:00 compute-0 sudo[191550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmnhsgiugcveafvsumhvtqejfifabdrt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119800.1721284-1463-35539721443916/AnsiballZ_edpm_container_manage.py'
Jan 22 22:10:00 compute-0 sudo[191550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:00 compute-0 python3[191552]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:10:01 compute-0 podman[191587]: 2026-01-22 22:10:01.152997313 +0000 UTC m=+0.057456936 container create ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 22:10:01 compute-0 podman[191587]: 2026-01-22 22:10:01.121135585 +0000 UTC m=+0.025595238 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 22:10:01 compute-0 python3[191552]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 22 22:10:01 compute-0 sudo[191550]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:01 compute-0 sudo[191774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csvllqcjitgaonojcsnlxtmsyyefnlfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119801.4701989-1487-86381100552840/AnsiballZ_stat.py'
Jan 22 22:10:01 compute-0 sudo[191774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:01 compute-0 python3.9[191776]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:01 compute-0 sudo[191774]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:02 compute-0 sudo[191928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxralxxermomzcwjlivpgxhsjftdgmno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119802.2975519-1514-113107941742941/AnsiballZ_file.py'
Jan 22 22:10:02 compute-0 sudo[191928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:02 compute-0 python3.9[191930]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:02 compute-0 sudo[191928]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:02 compute-0 sudo[192004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfntdlcmyumhzgrlfoxdptnpyeuqqbnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119802.2975519-1514-113107941742941/AnsiballZ_stat.py'
Jan 22 22:10:02 compute-0 sudo[192004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:03 compute-0 python3.9[192006]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:03 compute-0 sudo[192004]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:03 compute-0 sudo[192155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyfurxtxgthiexegvdqgeezvxubxdphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119803.202676-1514-44474587985763/AnsiballZ_copy.py'
Jan 22 22:10:03 compute-0 sudo[192155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:03 compute-0 python3.9[192157]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119803.202676-1514-44474587985763/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:03 compute-0 sudo[192155]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:04 compute-0 sudo[192231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqikcrfqznywvfeertjpvhuvuwdvegnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119803.202676-1514-44474587985763/AnsiballZ_systemd.py'
Jan 22 22:10:04 compute-0 sudo[192231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:04 compute-0 python3.9[192233]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:10:04 compute-0 systemd[1]: Reloading.
Jan 22 22:10:04 compute-0 systemd-rc-local-generator[192260]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:04 compute-0 systemd-sysv-generator[192264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:05 compute-0 sudo[192231]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:05 compute-0 sudo[192342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqynilfufcrvmquteniisnrftxdtofm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119803.202676-1514-44474587985763/AnsiballZ_systemd.py'
Jan 22 22:10:05 compute-0 sudo[192342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:05 compute-0 python3.9[192344]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:10:05 compute-0 systemd[1]: Reloading.
Jan 22 22:10:05 compute-0 systemd-sysv-generator[192378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:05 compute-0 systemd-rc-local-generator[192375]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:06 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 22 22:10:06 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:10:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d1863394f3b2840afd96c9363d6ee1ad50b368150808a0c3c17c91f12ba6189/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d1863394f3b2840afd96c9363d6ee1ad50b368150808a0c3c17c91f12ba6189/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d1863394f3b2840afd96c9363d6ee1ad50b368150808a0c3c17c91f12ba6189/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d1863394f3b2840afd96c9363d6ee1ad50b368150808a0c3c17c91f12ba6189/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:07 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3.
Jan 22 22:10:07 compute-0 podman[192385]: 2026-01-22 22:10:07.84227444 +0000 UTC m=+1.464111816 container init ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + sudo -E kolla_set_configs
Jan 22 22:10:07 compute-0 sudo[192407]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 22 22:10:07 compute-0 podman[192385]: 2026-01-22 22:10:07.879684302 +0000 UTC m=+1.501521628 container start ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: sudo: unable to send audit message: Operation not permitted
Jan 22 22:10:07 compute-0 sudo[192407]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 22 22:10:07 compute-0 sudo[192407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Validating config file
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Copying service configuration files
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: INFO:__main__:Writing out command to execute
Jan 22 22:10:07 compute-0 sudo[192407]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: ++ cat /run_command
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + ARGS=
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + sudo kolla_copy_cacerts
Jan 22 22:10:07 compute-0 sudo[192422]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: sudo: unable to send audit message: Operation not permitted
Jan 22 22:10:07 compute-0 sudo[192422]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 22 22:10:07 compute-0 sudo[192422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 22 22:10:07 compute-0 sudo[192422]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + [[ ! -n '' ]]
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + . kolla_extend_start
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + umask 0022
Jan 22 22:10:07 compute-0 ceilometer_agent_compute[192401]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 22 22:10:08 compute-0 podman[192385]: ceilometer_agent_compute
Jan 22 22:10:08 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 22 22:10:08 compute-0 sudo[192342]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:08 compute-0 podman[192408]: 2026-01-22 22:10:08.141873103 +0000 UTC m=+0.246188566 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 22:10:08 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:10:08 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Failed with result 'exit-code'.
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.852 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.853 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.854 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.855 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.856 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.857 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.858 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.860 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.861 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.862 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.863 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.864 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.865 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.866 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.886 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.888 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.889 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 22 22:10:08 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:08.988 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 22:10:09 compute-0 python3.9[192582]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.067 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.067 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.067 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.067 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.068 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.069 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.070 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.071 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.072 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.073 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.074 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.075 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.076 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.077 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.078 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.079 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.080 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.081 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.082 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.083 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.084 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.085 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.086 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.087 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.088 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.090 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.093 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.101 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:10:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:10:10 compute-0 sudo[192738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkrqensbjnbghaoyzicqgdfrmeqklmbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119809.825595-1649-134299841429684/AnsiballZ_stat.py'
Jan 22 22:10:10 compute-0 sudo[192738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:10 compute-0 python3.9[192740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:10 compute-0 sudo[192738]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:10 compute-0 sudo[192863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwvyvscizztegnhwxcnralpkiqjkxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119809.825595-1649-134299841429684/AnsiballZ_copy.py'
Jan 22 22:10:10 compute-0 sudo[192863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:10 compute-0 python3.9[192865]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119809.825595-1649-134299841429684/.source.yaml _original_basename=.nnkqojmb follow=False checksum=58334609f985ec5c9646ffb07fd0ee42148b1595 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:11 compute-0 sudo[192863]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:11 compute-0 sudo[193015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fekkdzpazallkwwmgwfrfhcxeprmaohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119811.1977766-1694-214434962104201/AnsiballZ_stat.py'
Jan 22 22:10:11 compute-0 sudo[193015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:11 compute-0 python3.9[193017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:11 compute-0 sudo[193015]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:12 compute-0 sudo[193138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aalkqhtskzijeidglnnkvidlrbhspdvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119811.1977766-1694-214434962104201/AnsiballZ_copy.py'
Jan 22 22:10:12 compute-0 sudo[193138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:12 compute-0 python3.9[193140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119811.1977766-1694-214434962104201/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:12 compute-0 sudo[193138]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:10:12.418 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:10:12.419 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:10:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:10:13 compute-0 podman[193217]: 2026-01-22 22:10:13.173185868 +0000 UTC m=+0.103091945 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:10:13 compute-0 sudo[193316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxyhlnsajubzizmfokqavhqedmkpftiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119812.956394-1757-49486721156908/AnsiballZ_file.py'
Jan 22 22:10:13 compute-0 sudo[193316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:13 compute-0 python3.9[193318]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:13 compute-0 sudo[193316]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:14 compute-0 sudo[193468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdllduwrurypnxqqvsievoqbkweirhvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119813.7881815-1781-240623242030434/AnsiballZ_file.py'
Jan 22 22:10:14 compute-0 sudo[193468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:14 compute-0 python3.9[193470]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:14 compute-0 sudo[193468]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:14 compute-0 sudo[193620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynravyybsybnglodfuhuzbjageyurvzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119814.5018811-1805-263395396009496/AnsiballZ_stat.py'
Jan 22 22:10:14 compute-0 sudo[193620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:15 compute-0 python3.9[193622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:15 compute-0 sudo[193620]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:15 compute-0 sudo[193698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vympkphejgjfiztgxuvqxetrolbpbmlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119814.5018811-1805-263395396009496/AnsiballZ_file.py'
Jan 22 22:10:15 compute-0 sudo[193698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:15 compute-0 python3.9[193700]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.yd7g5vc0 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:15 compute-0 sudo[193698]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:16 compute-0 python3.9[193850]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:18 compute-0 sudo[194271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeldvddnohwalsfjpwbqbuxcmrbmnlpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119818.0157967-1916-232848191128386/AnsiballZ_container_config_data.py'
Jan 22 22:10:18 compute-0 sudo[194271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:18 compute-0 python3.9[194273]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 22 22:10:18 compute-0 sudo[194271]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:19 compute-0 sudo[194423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtmqojjsajxgajnhtdysjvutfevbcgaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119819.029381-1949-142826182544068/AnsiballZ_container_config_hash.py'
Jan 22 22:10:19 compute-0 sudo[194423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:19 compute-0 python3.9[194425]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:10:19 compute-0 sudo[194423]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:20 compute-0 podman[194496]: 2026-01-22 22:10:20.12068026 +0000 UTC m=+0.053070881 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 22:10:20 compute-0 sudo[194594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgycrqqlqgfrdmpxbfprisxsewnqkymt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119819.9897096-1979-268436488341727/AnsiballZ_edpm_container_manage.py'
Jan 22 22:10:20 compute-0 sudo[194594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:20 compute-0 python3[194596]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:10:20 compute-0 podman[194630]: 2026-01-22 22:10:20.7643751 +0000 UTC m=+0.069518433 container create 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:10:20 compute-0 podman[194630]: 2026-01-22 22:10:20.720054899 +0000 UTC m=+0.025198282 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 22:10:20 compute-0 python3[194596]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 22 22:10:20 compute-0 sudo[194594]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:21 compute-0 sudo[194818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqjxxkkcxjppwhutuwbtahqicvknxvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119821.2206275-2003-92038935599834/AnsiballZ_stat.py'
Jan 22 22:10:21 compute-0 sudo[194818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:21 compute-0 python3.9[194820]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:21 compute-0 sudo[194818]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:22 compute-0 sudo[194972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbxsgvpwctikjmzoxuzncadrcgswnizo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119822.0237935-2030-25659368842868/AnsiballZ_file.py'
Jan 22 22:10:22 compute-0 sudo[194972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:22 compute-0 python3.9[194974]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:22 compute-0 sudo[194972]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:22 compute-0 sudo[195048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alsobalgjkoqyhnklpcwtzlfqaicbawb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119822.0237935-2030-25659368842868/AnsiballZ_stat.py'
Jan 22 22:10:22 compute-0 sudo[195048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:22 compute-0 python3.9[195050]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:23 compute-0 sudo[195048]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:23 compute-0 sudo[195199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjnpbefdzmtkperbdekonsucewdwnkmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119823.0727367-2030-85509195771438/AnsiballZ_copy.py'
Jan 22 22:10:23 compute-0 sudo[195199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:23 compute-0 python3.9[195201]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119823.0727367-2030-85509195771438/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:23 compute-0 sudo[195199]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:24 compute-0 rsyslogd[1008]: imjournal from <np0005592765:sudo>: begin to drop messages due to rate-limiting
Jan 22 22:10:24 compute-0 sudo[195275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjqoodcfrprdsopnuhmsftycyvczdsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119823.0727367-2030-85509195771438/AnsiballZ_systemd.py'
Jan 22 22:10:24 compute-0 sudo[195275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:24 compute-0 python3.9[195277]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:10:24 compute-0 systemd[1]: Reloading.
Jan 22 22:10:24 compute-0 systemd-rc-local-generator[195301]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:24 compute-0 systemd-sysv-generator[195307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:24 compute-0 sudo[195275]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:25 compute-0 sudo[195386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgrhgvqpbqjbhqmeqtflqsvdruqxrely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119823.0727367-2030-85509195771438/AnsiballZ_systemd.py'
Jan 22 22:10:25 compute-0 sudo[195386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:25 compute-0 python3.9[195388]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:10:25 compute-0 systemd[1]: Reloading.
Jan 22 22:10:25 compute-0 systemd-rc-local-generator[195417]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:25 compute-0 systemd-sysv-generator[195420]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:26 compute-0 systemd[1]: Starting node_exporter container...
Jan 22 22:10:26 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:10:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c1c241c6e1e5f1d64ff6ff3a17f9d7244aae94b3705c10a4a9200bf3e25ee2f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c1c241c6e1e5f1d64ff6ff3a17f9d7244aae94b3705c10a4a9200bf3e25ee2f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:26 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53.
Jan 22 22:10:26 compute-0 podman[195427]: 2026-01-22 22:10:26.367371811 +0000 UTC m=+0.305083765 container init 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.387Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.387Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.387Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=arp
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=bcache
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=bonding
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=cpu
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=edac
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=filefd
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=netclass
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=netdev
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=netstat
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=nfs
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=nvme
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=softnet
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=systemd
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=xfs
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.388Z caller=node_exporter.go:117 level=info collector=zfs
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.389Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 22 22:10:26 compute-0 node_exporter[195443]: ts=2026-01-22T22:10:26.389Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 22 22:10:26 compute-0 podman[195427]: 2026-01-22 22:10:26.413559379 +0000 UTC m=+0.351271273 container start 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:10:26 compute-0 podman[195427]: node_exporter
Jan 22 22:10:26 compute-0 systemd[1]: Started node_exporter container.
Jan 22 22:10:26 compute-0 sudo[195386]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:26 compute-0 podman[195453]: 2026-01-22 22:10:26.69775373 +0000 UTC m=+0.270504469 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:10:27 compute-0 python3.9[195627]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:10:28 compute-0 sudo[195777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muqoujsjfszfqoameyrjswtcadpmrojt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119828.411761-2165-116875532652052/AnsiballZ_stat.py'
Jan 22 22:10:28 compute-0 sudo[195777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:28 compute-0 python3.9[195779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:29 compute-0 sudo[195777]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:29 compute-0 sudo[195902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srbpdnmhpyomzzjjtrkuuqviapfdloqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119828.411761-2165-116875532652052/AnsiballZ_copy.py'
Jan 22 22:10:29 compute-0 sudo[195902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:29 compute-0 python3.9[195904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119828.411761-2165-116875532652052/.source.yaml _original_basename=.7gmt4uee follow=False checksum=e19705558811577fcab5aa70c84a932fb5df0a2c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:29 compute-0 sudo[195902]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:30 compute-0 sudo[196054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfolriqimbpglvbibrofhcujwnnvbwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119829.8433044-2210-275946835739721/AnsiballZ_stat.py'
Jan 22 22:10:30 compute-0 sudo[196054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:30 compute-0 python3.9[196056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:30 compute-0 sudo[196054]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:30 compute-0 sudo[196177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtxkitwwecvgbomaftshskrvoowvwaws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119829.8433044-2210-275946835739721/AnsiballZ_copy.py'
Jan 22 22:10:30 compute-0 sudo[196177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:30 compute-0 python3.9[196179]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119829.8433044-2210-275946835739721/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:30 compute-0 sudo[196177]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:31 compute-0 sudo[196329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-korgxgebrxdxncgwkkzjpveltoipbcqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119831.6565168-2273-9132917739594/AnsiballZ_file.py'
Jan 22 22:10:31 compute-0 sudo[196329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:32 compute-0 python3.9[196331]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:32 compute-0 sudo[196329]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:32 compute-0 sudo[196481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqzxjsnpyupzshuqxfesxybslixfiaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119832.4279807-2297-25293703893919/AnsiballZ_file.py'
Jan 22 22:10:32 compute-0 sudo[196481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:33 compute-0 python3.9[196483]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:33 compute-0 sudo[196481]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:33 compute-0 sudo[196633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thmcuizjarqbkjabdcuxqisploovslyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119833.2443748-2321-68094412613743/AnsiballZ_stat.py'
Jan 22 22:10:33 compute-0 sudo[196633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:33 compute-0 python3.9[196635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:33 compute-0 sudo[196633]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:34 compute-0 sudo[196711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptiaspfgdpdvrowrrfbxkkepvtriyvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119833.2443748-2321-68094412613743/AnsiballZ_file.py'
Jan 22 22:10:34 compute-0 sudo[196711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:34 compute-0 python3.9[196713]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.vxcbwrt0 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:34 compute-0 sudo[196711]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:35 compute-0 python3.9[196863]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:37 compute-0 sudo[197284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgmnvkpmxvnjzslpjlketlawldxazrjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119836.886152-2432-193074190288029/AnsiballZ_container_config_data.py'
Jan 22 22:10:37 compute-0 sudo[197284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:37 compute-0 python3.9[197286]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 22 22:10:37 compute-0 sudo[197284]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:38 compute-0 sudo[197448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fazcsiyeyfjcjwbenbakupdairckkjud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119837.9664624-2465-52949022203500/AnsiballZ_container_config_hash.py'
Jan 22 22:10:38 compute-0 sudo[197448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:38 compute-0 podman[197410]: 2026-01-22 22:10:38.364287765 +0000 UTC m=+0.085681088 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:10:38 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:10:38 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Failed with result 'exit-code'.
Jan 22 22:10:38 compute-0 python3.9[197454]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:10:38 compute-0 sudo[197448]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:39 compute-0 sudo[197608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-najcnczelbchzmcsiggozjihnntnvfwy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119839.0063953-2495-110613700099388/AnsiballZ_edpm_container_manage.py'
Jan 22 22:10:39 compute-0 sudo[197608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:39 compute-0 python3[197610]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:10:43 compute-0 podman[197623]: 2026-01-22 22:10:43.528761109 +0000 UTC m=+3.786945346 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 22 22:10:43 compute-0 podman[197720]: 2026-01-22 22:10:43.729978211 +0000 UTC m=+0.040385522 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 22 22:10:44 compute-0 podman[197720]: 2026-01-22 22:10:44.291193655 +0000 UTC m=+0.601600916 container create 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:10:44 compute-0 python3[197610]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 22 22:10:44 compute-0 podman[197733]: 2026-01-22 22:10:44.309827852 +0000 UTC m=+0.247379610 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:10:44 compute-0 sudo[197608]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:45 compute-0 sudo[197933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfgrfkxkkngdztexaojfdhekngiqiosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119844.7512863-2519-248670808178067/AnsiballZ_stat.py'
Jan 22 22:10:45 compute-0 sudo[197933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:45 compute-0 python3.9[197935]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:45 compute-0 sudo[197933]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:46 compute-0 sudo[198087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzaccqxvybjazoepaoogtcbvegfxxhtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119846.0565152-2546-236806887027121/AnsiballZ_file.py'
Jan 22 22:10:46 compute-0 sudo[198087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:46 compute-0 python3.9[198089]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:46 compute-0 sudo[198087]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:47 compute-0 sudo[198163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrnjmxkhxbhsdmtyzbvbkjtrksjyhkoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119846.0565152-2546-236806887027121/AnsiballZ_stat.py'
Jan 22 22:10:47 compute-0 sudo[198163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:47 compute-0 python3.9[198165]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:10:47 compute-0 sudo[198163]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:48 compute-0 sudo[198314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzllheygfpsaerphfryfhdmjdlmfmhtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119847.5147126-2546-99363422890821/AnsiballZ_copy.py'
Jan 22 22:10:48 compute-0 sudo[198314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:48 compute-0 python3.9[198316]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119847.5147126-2546-99363422890821/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:48 compute-0 sudo[198314]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:48 compute-0 sudo[198390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buzfshxjbzxbctoobmfhhlgwjabpexnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119847.5147126-2546-99363422890821/AnsiballZ_systemd.py'
Jan 22 22:10:48 compute-0 sudo[198390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:49 compute-0 python3.9[198392]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:10:49 compute-0 systemd[1]: Reloading.
Jan 22 22:10:49 compute-0 systemd-sysv-generator[198424]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:49 compute-0 systemd-rc-local-generator[198421]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:49 compute-0 sudo[198390]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:49 compute-0 sudo[198502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlhqfwzpboxwnpijxhnypvyjxsvwetf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119847.5147126-2546-99363422890821/AnsiballZ_systemd.py'
Jan 22 22:10:49 compute-0 sudo[198502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:50 compute-0 python3.9[198504]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:10:50 compute-0 systemd[1]: Reloading.
Jan 22 22:10:50 compute-0 podman[198506]: 2026-01-22 22:10:50.450045735 +0000 UTC m=+0.085954045 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:10:50 compute-0 systemd-sysv-generator[198552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:10:50 compute-0 systemd-rc-local-generator[198549]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:10:50 compute-0 systemd[1]: Starting podman_exporter container...
Jan 22 22:10:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:10:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab53c072f7d2a844b1a5b134337c50da82eb0c9d2767b1ab9eaf4a1295c3bfc6/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab53c072f7d2a844b1a5b134337c50da82eb0c9d2767b1ab9eaf4a1295c3bfc6/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 22:10:50 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26.
Jan 22 22:10:50 compute-0 podman[198562]: 2026-01-22 22:10:50.818696643 +0000 UTC m=+0.124745207 container init 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.837Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.837Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.837Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.837Z caller=handler.go:105 level=info collector=container
Jan 22 22:10:50 compute-0 podman[198562]: 2026-01-22 22:10:50.847001272 +0000 UTC m=+0.153049856 container start 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:10:50 compute-0 podman[198562]: podman_exporter
Jan 22 22:10:50 compute-0 systemd[1]: Starting Podman API Service...
Jan 22 22:10:50 compute-0 systemd[1]: Started Podman API Service.
Jan 22 22:10:50 compute-0 systemd[1]: Started podman_exporter container.
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="Setting parallel job count to 25"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="Using sqlite as database backend"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 22 22:10:50 compute-0 sudo[198502]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:50 compute-0 podman[198588]: @ - - [22/Jan/2026:22:10:50 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 22 22:10:50 compute-0 podman[198588]: time="2026-01-22T22:10:50Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 22:10:50 compute-0 podman[198588]: @ - - [22/Jan/2026:22:10:50 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18076 "" "Go-http-client/1.1"
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.918Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.918Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 22 22:10:50 compute-0 podman_exporter[198577]: ts=2026-01-22T22:10:50.918Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 22 22:10:50 compute-0 podman[198586]: 2026-01-22 22:10:50.922862063 +0000 UTC m=+0.058526978 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:10:50 compute-0 systemd[1]: 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26-6358bfa16c194c34.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:10:50 compute-0 systemd[1]: 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26-6358bfa16c194c34.service: Failed with result 'exit-code'.
Jan 22 22:10:52 compute-0 python3.9[198773]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:10:54 compute-0 sudo[198923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shetxgawkazqhgqfyxavymvoifzmksrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119853.5824354-2681-259465965933355/AnsiballZ_stat.py'
Jan 22 22:10:54 compute-0 sudo[198923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:54 compute-0 python3.9[198925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:54 compute-0 sudo[198923]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:54 compute-0 sudo[199048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cczhfoarxtbglfibmudhitiusofcimru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119853.5824354-2681-259465965933355/AnsiballZ_copy.py'
Jan 22 22:10:54 compute-0 sudo[199048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:54 compute-0 python3.9[199050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119853.5824354-2681-259465965933355/.source.yaml _original_basename=.qo2hpaha follow=False checksum=9afb18950d57de435859b2e59656b3033d5fd3a9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:54 compute-0 sudo[199048]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:55 compute-0 sudo[199200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdszouszchqtgsswbwtbicyxihhzipqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119855.314124-2726-219482704044014/AnsiballZ_stat.py'
Jan 22 22:10:55 compute-0 sudo[199200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:55 compute-0 python3.9[199202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:55 compute-0 sudo[199200]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:56 compute-0 sudo[199323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyadffyvhfpijrrllafryfxvuflrjrol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119855.314124-2726-219482704044014/AnsiballZ_copy.py'
Jan 22 22:10:56 compute-0 sudo[199323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:56 compute-0 python3.9[199325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119855.314124-2726-219482704044014/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:56 compute-0 sudo[199323]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:57 compute-0 podman[199350]: 2026-01-22 22:10:57.15903302 +0000 UTC m=+0.074944329 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:10:57 compute-0 sudo[199499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkfrzijdxzbzbrujhdoraszarokthexx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119857.3332775-2789-51532106161735/AnsiballZ_file.py'
Jan 22 22:10:57 compute-0 sudo[199499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:57 compute-0 python3.9[199501]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:57 compute-0 sudo[199499]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:58 compute-0 sudo[199651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrfqoacnvbylbvuwmeygjkjkzipxfafu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119858.203487-2813-52559825584105/AnsiballZ_file.py'
Jan 22 22:10:58 compute-0 sudo[199651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:58 compute-0 python3.9[199653]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 22:10:58 compute-0 sudo[199651]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:59 compute-0 sudo[199803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grovqhwkdsavuquuvzehqkhwmmcobvdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119858.94527-2837-279079883504917/AnsiballZ_stat.py'
Jan 22 22:10:59 compute-0 sudo[199803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:59 compute-0 python3.9[199805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:10:59 compute-0 sudo[199803]: pam_unix(sudo:session): session closed for user root
Jan 22 22:10:59 compute-0 sudo[199881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prvinvybfhrakkulezuguaayapgjczwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119858.94527-2837-279079883504917/AnsiballZ_file.py'
Jan 22 22:10:59 compute-0 sudo[199881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:10:59 compute-0 python3.9[199883]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.bc5xy78_ recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:10:59 compute-0 sudo[199881]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.230 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.231 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.252 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.252 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.253 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.273 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.274 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 python3.9[200033]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.919 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:11:00 compute-0 nova_compute[182725]: 2026-01-22 22:11:00.920 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.124 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.125 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5988MB free_disk=73.55096054077148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.125 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.125 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.220 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.221 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.246 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.280 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.283 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:11:01 compute-0 nova_compute[182725]: 2026-01-22 22:11:01.283 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:11:02 compute-0 nova_compute[182725]: 2026-01-22 22:11:02.283 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:02 compute-0 sudo[200454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-runslndqvlclzwypwcwoddazsmcllndr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119862.5106828-2948-213917813752010/AnsiballZ_container_config_data.py'
Jan 22 22:11:02 compute-0 sudo[200454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:03 compute-0 python3.9[200456]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 22 22:11:03 compute-0 sudo[200454]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:03 compute-0 sudo[200606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwqlnyfwfgybsspxewptlekbhgaaibm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119863.5323207-2981-138265281213037/AnsiballZ_container_config_hash.py'
Jan 22 22:11:03 compute-0 sudo[200606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:04 compute-0 python3.9[200608]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 22:11:04 compute-0 sudo[200606]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:04 compute-0 sudo[200758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgsbeszmlspuaeknqkeapvppjfzdtmjg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119864.6000104-3011-2378314359563/AnsiballZ_edpm_container_manage.py'
Jan 22 22:11:04 compute-0 sudo[200758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:05 compute-0 python3[200760]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 22:11:08 compute-0 podman[200834]: 2026-01-22 22:11:08.744526592 +0000 UTC m=+0.145649271 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 22:11:08 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 22:11:08 compute-0 systemd[1]: ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3-2a73e24eadd9bdae.service: Failed with result 'exit-code'.
Jan 22 22:11:09 compute-0 podman[200775]: 2026-01-22 22:11:09.39929047 +0000 UTC m=+4.064001138 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 22:11:09 compute-0 podman[200891]: 2026-01-22 22:11:09.546062788 +0000 UTC m=+0.027661924 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 22:11:10 compute-0 podman[200891]: 2026-01-22 22:11:10.081672138 +0000 UTC m=+0.563271254 container create cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 22:11:10 compute-0 python3[200760]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 22 22:11:10 compute-0 sudo[200758]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:11 compute-0 sudo[201078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejesexkaxrctfyyivqysybkrbhdvhsjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119870.947368-3035-40843214466409/AnsiballZ_stat.py'
Jan 22 22:11:11 compute-0 sudo[201078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:11 compute-0 python3.9[201080]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:11:11 compute-0 sudo[201078]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:12 compute-0 sudo[201232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cusknimevttpgizttqcdpxxtxamhwlfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119871.8017783-3062-122727069067388/AnsiballZ_file.py'
Jan 22 22:11:12 compute-0 sudo[201232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:12 compute-0 python3.9[201234]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:12 compute-0 sudo[201232]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:11:12.419 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:11:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:11:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:11:12 compute-0 sudo[201308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puyuyakmjrumujpqfkqltpcnjexvhtkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119871.8017783-3062-122727069067388/AnsiballZ_stat.py'
Jan 22 22:11:12 compute-0 sudo[201308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:12 compute-0 python3.9[201310]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:11:12 compute-0 sudo[201308]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:13 compute-0 sudo[201459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmbwwuxcdpboxewboqehrurcwouqiaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119872.8375251-3062-117558245521036/AnsiballZ_copy.py'
Jan 22 22:11:13 compute-0 sudo[201459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:13 compute-0 python3.9[201461]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119872.8375251-3062-117558245521036/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:13 compute-0 sudo[201459]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:13 compute-0 sudo[201535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kofiieqlfggkbpsosdounqgelvqytpsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119872.8375251-3062-117558245521036/AnsiballZ_systemd.py'
Jan 22 22:11:13 compute-0 sudo[201535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:13 compute-0 python3.9[201537]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 22:11:13 compute-0 systemd[1]: Reloading.
Jan 22 22:11:14 compute-0 systemd-sysv-generator[201563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:11:14 compute-0 systemd-rc-local-generator[201559]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:11:14 compute-0 sudo[201535]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:14 compute-0 podman[201574]: 2026-01-22 22:11:14.636430883 +0000 UTC m=+0.131948808 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:11:14 compute-0 sudo[201675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvxvbbjxrlksqiidwujkdkziyxmfvye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119872.8375251-3062-117558245521036/AnsiballZ_systemd.py'
Jan 22 22:11:14 compute-0 sudo[201675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:15 compute-0 python3.9[201677]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 22:11:15 compute-0 systemd[1]: Reloading.
Jan 22 22:11:15 compute-0 systemd-rc-local-generator[201707]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 22:11:15 compute-0 systemd-sysv-generator[201710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 22:11:15 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 22 22:11:16 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b52f05b0c2601611aaf47b7840f8be58f2e5f2bb1105875d9b6104ff0f0a1c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 22:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b52f05b0c2601611aaf47b7840f8be58f2e5f2bb1105875d9b6104ff0f0a1c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 22:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b52f05b0c2601611aaf47b7840f8be58f2e5f2bb1105875d9b6104ff0f0a1c/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 22:11:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393.
Jan 22 22:11:16 compute-0 podman[201717]: 2026-01-22 22:11:16.595168907 +0000 UTC m=+0.728449436 container init cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc.)
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *bridge.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *coverage.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *datapath.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *iface.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *memory.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *ovn.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *pmd_perf.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *pmd_rxq.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: INFO    22:11:16 main.go:48: registering *vswitch.Collector
Jan 22 22:11:16 compute-0 openstack_network_exporter[201732]: NOTICE  22:11:16 main.go:76: listening on https://:9105/metrics
Jan 22 22:11:16 compute-0 podman[201717]: 2026-01-22 22:11:16.636477741 +0000 UTC m=+0.769758300 container start cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, 
build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7)
Jan 22 22:11:16 compute-0 podman[201717]: openstack_network_exporter
Jan 22 22:11:16 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 22 22:11:16 compute-0 sudo[201675]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:16 compute-0 podman[201742]: 2026-01-22 22:11:16.754679684 +0000 UTC m=+0.099689520 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 22 22:11:17 compute-0 python3.9[201914]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 22:11:18 compute-0 sudo[202064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huvrchmifbynslfnalcnikezhlakjkhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119878.3921618-3197-92582101841527/AnsiballZ_stat.py'
Jan 22 22:11:18 compute-0 sudo[202064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:18 compute-0 python3.9[202066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:11:18 compute-0 sudo[202064]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:19 compute-0 sudo[202189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuaajqoxnwixzyirrazfeiqrxkjnddkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119878.3921618-3197-92582101841527/AnsiballZ_copy.py'
Jan 22 22:11:19 compute-0 sudo[202189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:19 compute-0 python3.9[202191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119878.3921618-3197-92582101841527/.source.yaml _original_basename=.1untjxij follow=False checksum=e73648dbb2ed8394dae4b77dfc11eea50b7aef3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:19 compute-0 sudo[202189]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:19 compute-0 auditd[705]: Audit daemon rotating log files
Jan 22 22:11:20 compute-0 sudo[202341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-styeemjgdfkwqhovsfkcxjhozquwrxvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119879.746255-3242-264063465538028/AnsiballZ_find.py'
Jan 22 22:11:20 compute-0 sudo[202341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:20 compute-0 python3.9[202343]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 22:11:20 compute-0 sudo[202341]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:21 compute-0 podman[202439]: 2026-01-22 22:11:21.132701426 +0000 UTC m=+0.060626545 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:11:21 compute-0 podman[202444]: 2026-01-22 22:11:21.145919091 +0000 UTC m=+0.058903640 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:11:21 compute-0 sudo[202534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuetywtwbwojpmfeqjcqczkpqiicvgzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119880.783446-3270-177612674692751/AnsiballZ_podman_container_info.py'
Jan 22 22:11:21 compute-0 sudo[202534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:21 compute-0 python3.9[202536]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 22 22:11:21 compute-0 sudo[202534]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:22 compute-0 sudo[202699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ginzgpvnmzbgsnnhvmfdlcazubmmdtuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119881.6461673-3278-16430063230735/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:22 compute-0 sudo[202699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:22 compute-0 python3.9[202701]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:22 compute-0 systemd[1]: Started libpod-conmon-c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0.scope.
Jan 22 22:11:22 compute-0 podman[202702]: 2026-01-22 22:11:22.909083287 +0000 UTC m=+0.572401636 container exec c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:11:22 compute-0 podman[202702]: 2026-01-22 22:11:22.945468537 +0000 UTC m=+0.608786896 container exec_died c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:11:23 compute-0 sudo[202699]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:23 compute-0 systemd[1]: libpod-conmon-c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0.scope: Deactivated successfully.
Jan 22 22:11:23 compute-0 sudo[202884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krrzltrcrvukiioswcvoqzmntngthnmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119883.4387212-3286-5843365600074/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:23 compute-0 sudo[202884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:23 compute-0 python3.9[202886]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:24 compute-0 systemd[1]: Started libpod-conmon-c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0.scope.
Jan 22 22:11:24 compute-0 podman[202887]: 2026-01-22 22:11:24.243071719 +0000 UTC m=+0.292141393 container exec c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:11:24 compute-0 podman[202887]: 2026-01-22 22:11:24.511315317 +0000 UTC m=+0.560385041 container exec_died c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:11:24 compute-0 sudo[202884]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:24 compute-0 systemd[1]: libpod-conmon-c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0.scope: Deactivated successfully.
Jan 22 22:11:25 compute-0 sudo[203069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfdwrbjlpfvmrqgimocgpcfigwsipxnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119884.7244396-3294-31264931838099/AnsiballZ_file.py'
Jan 22 22:11:25 compute-0 sudo[203069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:25 compute-0 python3.9[203071]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:25 compute-0 sudo[203069]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:25 compute-0 sudo[203221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lefgxmhyuirrnucvzxicitwoxccwvdvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119885.531344-3303-113189710063951/AnsiballZ_podman_container_info.py'
Jan 22 22:11:25 compute-0 sudo[203221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:26 compute-0 python3.9[203223]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 22 22:11:26 compute-0 sudo[203221]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:26 compute-0 sudo[203386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkfhnzqtfohutkyksefcmtzligsldpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119886.3678317-3311-124894690191604/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:26 compute-0 sudo[203386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:26 compute-0 python3.9[203388]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:27 compute-0 systemd[1]: Started libpod-conmon-7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462.scope.
Jan 22 22:11:27 compute-0 podman[203389]: 2026-01-22 22:11:27.089763963 +0000 UTC m=+0.202873702 container exec 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 22:11:27 compute-0 podman[203409]: 2026-01-22 22:11:27.172919186 +0000 UTC m=+0.064530388 container exec_died 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:11:27 compute-0 podman[203389]: 2026-01-22 22:11:27.413561593 +0000 UTC m=+0.526671262 container exec_died 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:11:27 compute-0 systemd[1]: libpod-conmon-7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462.scope: Deactivated successfully.
Jan 22 22:11:27 compute-0 sudo[203386]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:27 compute-0 podman[203422]: 2026-01-22 22:11:27.874170186 +0000 UTC m=+0.416736878 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:11:28 compute-0 sudo[203596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdgkjydarbzzxrhqhmdgsoqcbnzzyhay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119888.0160382-3319-44858612407036/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:28 compute-0 sudo[203596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:28 compute-0 python3.9[203598]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:29 compute-0 systemd[1]: Started libpod-conmon-7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462.scope.
Jan 22 22:11:29 compute-0 podman[203599]: 2026-01-22 22:11:29.052364038 +0000 UTC m=+0.407533659 container exec 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 22 22:11:29 compute-0 podman[203599]: 2026-01-22 22:11:29.062867222 +0000 UTC m=+0.418036813 container exec_died 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:11:29 compute-0 systemd[1]: libpod-conmon-7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462.scope: Deactivated successfully.
Jan 22 22:11:29 compute-0 sudo[203596]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:29 compute-0 sudo[203780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrslfegnsarhwvztvsqteeofbbbdaizk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119889.3215168-3327-158949335187377/AnsiballZ_file.py'
Jan 22 22:11:29 compute-0 sudo[203780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:29 compute-0 python3.9[203782]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:29 compute-0 sudo[203780]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:30 compute-0 sudo[203932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poeutqogrkfkcavfjysonsinqbmhsaci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119890.0806427-3336-31159139002722/AnsiballZ_podman_container_info.py'
Jan 22 22:11:30 compute-0 sudo[203932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:30 compute-0 python3.9[203934]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 22 22:11:30 compute-0 sudo[203932]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:31 compute-0 sudo[204097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqiuiyscpocnucgpdnglorwcacroowzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119890.824627-3344-238314809548251/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:31 compute-0 sudo[204097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:31 compute-0 python3.9[204099]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:31 compute-0 systemd[1]: Started libpod-conmon-ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3.scope.
Jan 22 22:11:31 compute-0 podman[204100]: 2026-01-22 22:11:31.54341612 +0000 UTC m=+0.195765455 container exec ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 22:11:31 compute-0 podman[204100]: 2026-01-22 22:11:31.575761495 +0000 UTC m=+0.228110830 container exec_died ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:11:31 compute-0 systemd[1]: libpod-conmon-ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3.scope: Deactivated successfully.
Jan 22 22:11:31 compute-0 sudo[204097]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:32 compute-0 sudo[204282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cycojsxlanwhnjvtcukwxdgyjggztolp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119891.7938097-3352-15936469724643/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:32 compute-0 sudo[204282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:32 compute-0 python3.9[204284]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:32 compute-0 systemd[1]: Started libpod-conmon-ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3.scope.
Jan 22 22:11:32 compute-0 podman[204285]: 2026-01-22 22:11:32.342711832 +0000 UTC m=+0.072573907 container exec ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 22:11:32 compute-0 podman[204285]: 2026-01-22 22:11:32.377116381 +0000 UTC m=+0.106978436 container exec_died ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:11:32 compute-0 systemd[1]: libpod-conmon-ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3.scope: Deactivated successfully.
Jan 22 22:11:32 compute-0 sudo[204282]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:32 compute-0 sudo[204465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbqbopaxnslatwqgflzxatksebtshbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119892.5602736-3360-106964932517234/AnsiballZ_file.py'
Jan 22 22:11:32 compute-0 sudo[204465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:33 compute-0 python3.9[204467]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:33 compute-0 sudo[204465]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:33 compute-0 sudo[204617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jozlknlgahhlrbbbbdnblowlavlvasyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119893.2712953-3369-223829332021395/AnsiballZ_podman_container_info.py'
Jan 22 22:11:33 compute-0 sudo[204617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:33 compute-0 python3.9[204619]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 22 22:11:33 compute-0 sudo[204617]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:34 compute-0 sudo[204782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixcklrzjmhjpvohvfatmwikyhbmpyxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119894.0111337-3377-65245280008023/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:34 compute-0 sudo[204782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:34 compute-0 python3.9[204784]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:34 compute-0 systemd[1]: Started libpod-conmon-08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53.scope.
Jan 22 22:11:34 compute-0 podman[204785]: 2026-01-22 22:11:34.762189514 +0000 UTC m=+0.240739610 container exec 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:11:35 compute-0 podman[204804]: 2026-01-22 22:11:35.075829118 +0000 UTC m=+0.293743925 container exec_died 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:11:35 compute-0 podman[204785]: 2026-01-22 22:11:35.107021763 +0000 UTC m=+0.585571839 container exec_died 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:11:35 compute-0 systemd[1]: libpod-conmon-08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53.scope: Deactivated successfully.
Jan 22 22:11:35 compute-0 sudo[204782]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:35 compute-0 sudo[204966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvninepjhpfxgmbjqzwossxorjwlcku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119895.3720558-3385-7946582520238/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:35 compute-0 sudo[204966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:35 compute-0 python3.9[204968]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:36 compute-0 systemd[1]: Started libpod-conmon-08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53.scope.
Jan 22 22:11:36 compute-0 podman[204969]: 2026-01-22 22:11:36.076487501 +0000 UTC m=+0.074419805 container exec 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:11:36 compute-0 podman[204969]: 2026-01-22 22:11:36.111364902 +0000 UTC m=+0.109297186 container exec_died 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:11:36 compute-0 systemd[1]: libpod-conmon-08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53.scope: Deactivated successfully.
Jan 22 22:11:36 compute-0 sudo[204966]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:36 compute-0 sudo[205150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipbjbxfpijmwdzhtlgulajdpztgnajnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119896.471336-3393-3800632247106/AnsiballZ_file.py'
Jan 22 22:11:36 compute-0 sudo[205150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:37 compute-0 python3.9[205152]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:37 compute-0 sudo[205150]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:37 compute-0 sudo[205302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbmclzybjtdumoncnzpybwmfrlxxrkrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119897.3216329-3402-163978382052642/AnsiballZ_podman_container_info.py'
Jan 22 22:11:37 compute-0 sudo[205302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:37 compute-0 python3.9[205304]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 22 22:11:37 compute-0 sudo[205302]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:38 compute-0 sudo[205467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aclsfdxnrayurvunjbaysridjybdzvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119898.1039307-3410-48378236186549/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:38 compute-0 sudo[205467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:38 compute-0 python3.9[205469]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:38 compute-0 systemd[1]: Started libpod-conmon-86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26.scope.
Jan 22 22:11:38 compute-0 podman[205470]: 2026-01-22 22:11:38.9471464 +0000 UTC m=+0.290114700 container exec 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:11:39 compute-0 podman[205489]: 2026-01-22 22:11:39.015045544 +0000 UTC m=+0.056650721 container exec_died 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:11:39 compute-0 podman[205470]: 2026-01-22 22:11:39.045069479 +0000 UTC m=+0.388037689 container exec_died 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:11:39 compute-0 systemd[1]: libpod-conmon-86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26.scope: Deactivated successfully.
Jan 22 22:11:39 compute-0 sudo[205467]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:39 compute-0 podman[205502]: 2026-01-22 22:11:39.227616448 +0000 UTC m=+0.156262004 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:11:39 compute-0 sudo[205668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixhyijkstkuyumtiufgnzvlmplckyuki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119899.4914706-3418-174486435141434/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:39 compute-0 sudo[205668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:40 compute-0 python3.9[205670]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:40 compute-0 systemd[1]: Started libpod-conmon-86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26.scope.
Jan 22 22:11:40 compute-0 podman[205671]: 2026-01-22 22:11:40.175329889 +0000 UTC m=+0.149514228 container exec 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:11:40 compute-0 podman[205671]: 2026-01-22 22:11:40.429838638 +0000 UTC m=+0.404022957 container exec_died 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:11:40 compute-0 sudo[205668]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:40 compute-0 systemd[1]: libpod-conmon-86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26.scope: Deactivated successfully.
Jan 22 22:11:41 compute-0 sudo[205852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvjudfsnapovsnbbyowgdwbjfmafgeyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119900.8212304-3426-19296962270581/AnsiballZ_file.py'
Jan 22 22:11:41 compute-0 sudo[205852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:41 compute-0 python3.9[205854]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:41 compute-0 sudo[205852]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:41 compute-0 sudo[206004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbkkhkyvfrizwymkpxfbyatymjsyqywy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119901.553873-3435-28772922360918/AnsiballZ_podman_container_info.py'
Jan 22 22:11:41 compute-0 sudo[206004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:42 compute-0 python3.9[206006]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 22 22:11:42 compute-0 sudo[206004]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:43 compute-0 sudo[206169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuiewtgzuuifopywpjdydayjglnrqvsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119902.7666118-3443-166484969879508/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:43 compute-0 sudo[206169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:45 compute-0 podman[206172]: 2026-01-22 22:11:45.223647363 +0000 UTC m=+0.141191600 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:11:45 compute-0 python3.9[206171]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:45 compute-0 systemd[1]: Started libpod-conmon-cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393.scope.
Jan 22 22:11:45 compute-0 podman[206199]: 2026-01-22 22:11:45.785770478 +0000 UTC m=+0.096910072 container exec cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:11:45 compute-0 podman[206199]: 2026-01-22 22:11:45.821497612 +0000 UTC m=+0.132637216 container exec_died cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 22:11:45 compute-0 systemd[1]: libpod-conmon-cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393.scope: Deactivated successfully.
Jan 22 22:11:45 compute-0 sudo[206169]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:46 compute-0 sudo[206385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkjejjuzkggmssdfdkgtjgfjvzsyolfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119906.0666807-3451-169952937292577/AnsiballZ_podman_container_exec.py'
Jan 22 22:11:46 compute-0 sudo[206385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:46 compute-0 python3.9[206387]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 22 22:11:46 compute-0 systemd[1]: Started libpod-conmon-cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393.scope.
Jan 22 22:11:46 compute-0 podman[206388]: 2026-01-22 22:11:46.711109684 +0000 UTC m=+0.065637486 container exec cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public)
Jan 22 22:11:46 compute-0 podman[206388]: 2026-01-22 22:11:46.742966686 +0000 UTC m=+0.097494468 container exec_died cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public)
Jan 22 22:11:46 compute-0 systemd[1]: libpod-conmon-cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393.scope: Deactivated successfully.
Jan 22 22:11:46 compute-0 sudo[206385]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:46 compute-0 podman[206417]: 2026-01-22 22:11:46.860346583 +0000 UTC m=+0.067597527 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 22:11:47 compute-0 sudo[206586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlnyrhbzpzwfzwcizldpimbwgohtworf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119906.9574885-3459-196067615223743/AnsiballZ_file.py'
Jan 22 22:11:47 compute-0 sudo[206586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:11:47 compute-0 python3.9[206588]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:11:47 compute-0 sudo[206586]: pam_unix(sudo:session): session closed for user root
Jan 22 22:11:52 compute-0 podman[206613]: 2026-01-22 22:11:52.131485088 +0000 UTC m=+0.062190036 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:11:52 compute-0 podman[206614]: 2026-01-22 22:11:52.135652057 +0000 UTC m=+0.061596471 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:11:58 compute-0 podman[206653]: 2026-01-22 22:11:58.110559468 +0000 UTC m=+0.047460851 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:11:59 compute-0 nova_compute[182725]: 2026-01-22 22:11:59.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:11:59 compute-0 nova_compute[182725]: 2026-01-22 22:11:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:00 compute-0 nova_compute[182725]: 2026-01-22 22:12:00.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.913 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.914 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.915 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.915 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.916 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.916 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.943 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.944 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.945 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:12:01 compute-0 nova_compute[182725]: 2026-01-22 22:12:01.945 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.169 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.170 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5963MB free_disk=73.4159927368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.171 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.172 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.256 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.257 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.283 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.300 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.302 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:12:02 compute-0 nova_compute[182725]: 2026-01-22 22:12:02.302 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:12:03 compute-0 nova_compute[182725]: 2026-01-22 22:12:03.277 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:12:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:12:10 compute-0 podman[206677]: 2026-01-22 22:12:10.208362299 +0000 UTC m=+0.126180658 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:12:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:12:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:12:12.420 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:12:16 compute-0 podman[206698]: 2026-01-22 22:12:16.162680122 +0000 UTC m=+0.093468273 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:12:17 compute-0 podman[206725]: 2026-01-22 22:12:17.189092179 +0000 UTC m=+0.112172742 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Jan 22 22:12:23 compute-0 podman[206746]: 2026-01-22 22:12:23.165581109 +0000 UTC m=+0.093064928 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:12:23 compute-0 podman[206747]: 2026-01-22 22:12:23.173938514 +0000 UTC m=+0.092792991 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:12:29 compute-0 sudo[206930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asbuqocpymnccrxfsdioqkbjzgvctrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119948.8154938-3866-232449179414968/AnsiballZ_file.py'
Jan 22 22:12:29 compute-0 podman[206888]: 2026-01-22 22:12:29.103666622 +0000 UTC m=+0.054818777 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:12:29 compute-0 sudo[206930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:29 compute-0 python3.9[206940]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:29 compute-0 sudo[206930]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:29 compute-0 sudo[207091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzyumwyxgkzvipnyaemdjbuzcwgcaiqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119949.570779-3890-83763005775692/AnsiballZ_stat.py'
Jan 22 22:12:29 compute-0 sudo[207091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:30 compute-0 python3.9[207093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:30 compute-0 sudo[207091]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:30 compute-0 sudo[207214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzrbuphofbxsoycxyemzzcynjsmhryeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119949.570779-3890-83763005775692/AnsiballZ_copy.py'
Jan 22 22:12:30 compute-0 sudo[207214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:30 compute-0 python3.9[207216]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119949.570779-3890-83763005775692/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:30 compute-0 sudo[207214]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:31 compute-0 sudo[207366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upfiqtfxutlctdtjlqraqfomwghlsswg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119951.034972-3938-176015631552961/AnsiballZ_file.py'
Jan 22 22:12:31 compute-0 sudo[207366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:31 compute-0 python3.9[207368]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:31 compute-0 sudo[207366]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:32 compute-0 sudo[207518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjhcwemfqmoptufbvkkfvxqatxnrgosc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119951.7620158-3962-434054002598/AnsiballZ_stat.py'
Jan 22 22:12:32 compute-0 sudo[207518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:32 compute-0 python3.9[207520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:32 compute-0 sudo[207518]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:32 compute-0 sudo[207596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vicvulmkkkkpxiavjdnsauksvevwrndj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119951.7620158-3962-434054002598/AnsiballZ_file.py'
Jan 22 22:12:32 compute-0 sudo[207596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:32 compute-0 python3.9[207598]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:32 compute-0 sudo[207596]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:33 compute-0 sudo[207748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yefewpqzszgowqjivcmywcxzhodvjfyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119953.193887-3998-277046033983428/AnsiballZ_stat.py'
Jan 22 22:12:33 compute-0 sudo[207748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:33 compute-0 python3.9[207750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:33 compute-0 sudo[207748]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:33 compute-0 sudo[207826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgwxfiabegjsfbvfukhrylmjhjehdqhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119953.193887-3998-277046033983428/AnsiballZ_file.py'
Jan 22 22:12:33 compute-0 sudo[207826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:34 compute-0 python3.9[207828]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.sr_anryu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:34 compute-0 sudo[207826]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:34 compute-0 sudo[207978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqfoxnfqvxjqztxnyixaipktchjfnuoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119954.3875902-4034-278465057901533/AnsiballZ_stat.py'
Jan 22 22:12:34 compute-0 sudo[207978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:34 compute-0 python3.9[207980]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:34 compute-0 sudo[207978]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:35 compute-0 sudo[208056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyiycvmbwaojprqligqyslmonwqmmpyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119954.3875902-4034-278465057901533/AnsiballZ_file.py'
Jan 22 22:12:35 compute-0 sudo[208056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:35 compute-0 python3.9[208058]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:35 compute-0 sudo[208056]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:35 compute-0 sudo[208208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjjrjbytjsaxkniwrwgwhlnwyhigbqqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119955.6893797-4073-207510702619146/AnsiballZ_command.py'
Jan 22 22:12:35 compute-0 sudo[208208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:36 compute-0 python3.9[208210]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:12:36 compute-0 sudo[208208]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:36 compute-0 sudo[208361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzbfrojrsvqzxtiiordxwjpgfclcgbnu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769119956.3854094-4097-36667363180350/AnsiballZ_edpm_nftables_from_files.py'
Jan 22 22:12:36 compute-0 sudo[208361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:37 compute-0 python3[208363]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 22:12:37 compute-0 sudo[208361]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:37 compute-0 sudo[208513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dajfftytwonauuigrogscagnnmbkveai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119957.253525-4121-188074802732984/AnsiballZ_stat.py'
Jan 22 22:12:37 compute-0 sudo[208513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:37 compute-0 python3.9[208515]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:37 compute-0 sudo[208513]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:37 compute-0 sudo[208591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjqtjcjjvqmjnjpuvimgkavmlclovhgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119957.253525-4121-188074802732984/AnsiballZ_file.py'
Jan 22 22:12:37 compute-0 sudo[208591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:38 compute-0 python3.9[208593]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:38 compute-0 sudo[208591]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:38 compute-0 sudo[208743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eedscjonmdpfmxijnahkeprclmiddkgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119958.4833803-4157-204071068183215/AnsiballZ_stat.py'
Jan 22 22:12:38 compute-0 sudo[208743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:39 compute-0 python3.9[208745]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:39 compute-0 sudo[208743]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:39 compute-0 sudo[208821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzudtxcobakukfwmfuwkeztpsahlmfgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119958.4833803-4157-204071068183215/AnsiballZ_file.py'
Jan 22 22:12:39 compute-0 sudo[208821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:39 compute-0 python3.9[208823]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:39 compute-0 sudo[208821]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:40 compute-0 sudo[208984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqiuqwwsggvqsxhhzppialyrjgflvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119959.7759874-4193-57441225100134/AnsiballZ_stat.py'
Jan 22 22:12:40 compute-0 sudo[208984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:40 compute-0 podman[208947]: 2026-01-22 22:12:40.669413933 +0000 UTC m=+0.061032790 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 22:12:40 compute-0 python3.9[208992]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:40 compute-0 sudo[208984]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:41 compute-0 sudo[209069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukscasanfvzahzwbpflasrzkohhjlpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119959.7759874-4193-57441225100134/AnsiballZ_file.py'
Jan 22 22:12:41 compute-0 sudo[209069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:41 compute-0 python3.9[209071]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:41 compute-0 sudo[209069]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:42 compute-0 sudo[209221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emqemhwhkvvgjgcjnetqsjmnnixxprro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119961.5069704-4229-157734979441711/AnsiballZ_stat.py'
Jan 22 22:12:42 compute-0 sudo[209221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:42 compute-0 python3.9[209223]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:42 compute-0 sudo[209221]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:42 compute-0 sudo[209299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anvcculcwrpnmnlhgfjilmptdrmxdahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119961.5069704-4229-157734979441711/AnsiballZ_file.py'
Jan 22 22:12:42 compute-0 sudo[209299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:42 compute-0 python3.9[209301]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:42 compute-0 sudo[209299]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:43 compute-0 sudo[209451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtnprnrliuvtnxunkwofowaqylykyty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119963.058773-4265-11200189693287/AnsiballZ_stat.py'
Jan 22 22:12:43 compute-0 sudo[209451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:43 compute-0 python3.9[209453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 22:12:43 compute-0 sudo[209451]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:44 compute-0 sudo[209576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmbgnlqcsmnuosqgfllnzivgewyrfqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119963.058773-4265-11200189693287/AnsiballZ_copy.py'
Jan 22 22:12:44 compute-0 sudo[209576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:44 compute-0 python3.9[209578]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119963.058773-4265-11200189693287/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:44 compute-0 sudo[209576]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:45 compute-0 sudo[209728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juktqnjwnaaxraitczmjyelnpqgsdrxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119964.7379906-4310-215512014187761/AnsiballZ_file.py'
Jan 22 22:12:45 compute-0 sudo[209728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:45 compute-0 python3.9[209730]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:45 compute-0 sudo[209728]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:45 compute-0 sudo[209880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbayiwazhjntiyjoocdllmplspjxpchm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119965.622671-4334-90573354746165/AnsiballZ_command.py'
Jan 22 22:12:45 compute-0 sudo[209880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:46 compute-0 python3.9[209882]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:12:46 compute-0 sudo[209880]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:47 compute-0 sudo[210048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxdgwmsfrgawrrrgosrmcfmuzaqqfhlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119966.5001895-4358-52066514770789/AnsiballZ_blockinfile.py'
Jan 22 22:12:47 compute-0 sudo[210048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:47 compute-0 podman[210009]: 2026-01-22 22:12:47.057852732 +0000 UTC m=+0.092204817 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:12:47 compute-0 python3.9[210057]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:47 compute-0 sudo[210048]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:47 compute-0 podman[210087]: 2026-01-22 22:12:47.401669329 +0000 UTC m=+0.064675570 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Jan 22 22:12:47 compute-0 sudo[210234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvghzifcsintycnasxifejdjupxhqkfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119967.5637805-4385-146330379519196/AnsiballZ_command.py'
Jan 22 22:12:47 compute-0 sudo[210234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:48 compute-0 python3.9[210236]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:12:48 compute-0 sudo[210234]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:48 compute-0 sudo[210387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwxvpxralvuoxrnrwmyngqctsvhccqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119968.4112391-4409-270885851472067/AnsiballZ_stat.py'
Jan 22 22:12:48 compute-0 sudo[210387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:48 compute-0 python3.9[210389]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 22:12:48 compute-0 sudo[210387]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:49 compute-0 sudo[210541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nohfxgqqvvwkeyzszkuqatlfohsmrvxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119969.1485603-4433-108544628679777/AnsiballZ_command.py'
Jan 22 22:12:49 compute-0 sudo[210541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:49 compute-0 python3.9[210543]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 22:12:49 compute-0 sudo[210541]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:50 compute-0 sudo[210696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrleavcgmbaupuqostqxautoupnqessy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769119969.958473-4457-209558970983149/AnsiballZ_file.py'
Jan 22 22:12:50 compute-0 sudo[210696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 22:12:50 compute-0 python3.9[210698]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 22:12:50 compute-0 sudo[210696]: pam_unix(sudo:session): session closed for user root
Jan 22 22:12:50 compute-0 sshd-session[183026]: Connection closed by 192.168.122.30 port 59630
Jan 22 22:12:50 compute-0 sshd-session[183023]: pam_unix(sshd:session): session closed for user zuul
Jan 22 22:12:50 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 22 22:12:50 compute-0 systemd[1]: session-25.scope: Consumed 2min 183ms CPU time.
Jan 22 22:12:50 compute-0 systemd-logind[801]: Session 25 logged out. Waiting for processes to exit.
Jan 22 22:12:50 compute-0 systemd-logind[801]: Removed session 25.
Jan 22 22:12:54 compute-0 podman[210724]: 2026-01-22 22:12:54.151847458 +0000 UTC m=+0.060678841 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:12:54 compute-0 podman[210723]: 2026-01-22 22:12:54.17026981 +0000 UTC m=+0.094152583 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 22:13:00 compute-0 podman[210764]: 2026-01-22 22:13:00.118954833 +0000 UTC m=+0.055684828 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:13:00 compute-0 nova_compute[182725]: 2026-01-22 22:13:00.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:00 compute-0 nova_compute[182725]: 2026-01-22 22:13:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.903 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.915 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.916 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:01 compute-0 nova_compute[182725]: 2026-01-22 22:13:01.916 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:03 compute-0 nova_compute[182725]: 2026-01-22 22:13:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:03 compute-0 nova_compute[182725]: 2026-01-22 22:13:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:03 compute-0 nova_compute[182725]: 2026-01-22 22:13:03.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:13:03 compute-0 nova_compute[182725]: 2026-01-22 22:13:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.006 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.007 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.008 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.008 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.239 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.240 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5974MB free_disk=73.41569519042969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.240 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.241 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.328 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.328 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.344 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.355 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.356 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:13:04 compute-0 nova_compute[182725]: 2026-01-22 22:13:04.356 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:13:05 compute-0 nova_compute[182725]: 2026-01-22 22:13:05.356 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:11 compute-0 podman[210788]: 2026-01-22 22:13:11.162507465 +0000 UTC m=+0.092468443 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:12.421 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:12.421 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:12.422 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:13:18 compute-0 podman[210811]: 2026-01-22 22:13:18.16989839 +0000 UTC m=+0.093898948 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 22:13:18 compute-0 podman[210810]: 2026-01-22 22:13:18.199661441 +0000 UTC m=+0.127960234 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 22:13:25 compute-0 podman[210859]: 2026-01-22 22:13:25.166536422 +0000 UTC m=+0.087256375 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:13:25 compute-0 podman[210860]: 2026-01-22 22:13:25.167567997 +0000 UTC m=+0.080186161 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:13:31 compute-0 podman[210902]: 2026-01-22 22:13:31.127804065 +0000 UTC m=+0.052555111 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:13:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:32.181 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:13:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:32.183 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:13:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:13:32.185 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:13:42 compute-0 podman[210927]: 2026-01-22 22:13:42.128577284 +0000 UTC m=+0.065484669 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:13:49 compute-0 podman[210946]: 2026-01-22 22:13:49.144839588 +0000 UTC m=+0.078143551 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 22:13:49 compute-0 podman[210945]: 2026-01-22 22:13:49.174865406 +0000 UTC m=+0.110948667 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:13:56 compute-0 podman[210991]: 2026-01-22 22:13:56.129437365 +0000 UTC m=+0.060162519 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:13:56 compute-0 podman[210992]: 2026-01-22 22:13:56.151053996 +0000 UTC m=+0.071126238 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.914 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.915 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.916 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:13:59 compute-0 nova_compute[182725]: 2026-01-22 22:13:59.940 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:00 compute-0 nova_compute[182725]: 2026-01-22 22:14:00.956 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:01 compute-0 nova_compute[182725]: 2026-01-22 22:14:01.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:01 compute-0 nova_compute[182725]: 2026-01-22 22:14:01.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:01 compute-0 nova_compute[182725]: 2026-01-22 22:14:01.887 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:14:01 compute-0 nova_compute[182725]: 2026-01-22 22:14:01.887 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:14:01 compute-0 nova_compute[182725]: 2026-01-22 22:14:01.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:14:02 compute-0 podman[211031]: 2026-01-22 22:14:02.159863396 +0000 UTC m=+0.090131666 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:14:02 compute-0 nova_compute[182725]: 2026-01-22 22:14:02.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:03 compute-0 nova_compute[182725]: 2026-01-22 22:14:03.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:03 compute-0 nova_compute[182725]: 2026-01-22 22:14:03.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:03 compute-0 nova_compute[182725]: 2026-01-22 22:14:03.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:14:03 compute-0 nova_compute[182725]: 2026-01-22 22:14:03.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.669 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.670 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.670 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.670 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.875 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.876 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6043MB free_disk=73.41772079467773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.877 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.877 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.994 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:14:04 compute-0 nova_compute[182725]: 2026-01-22 22:14:04.994 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.057 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.109 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.110 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.127 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.152 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.176 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.195 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.198 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:14:05 compute-0 nova_compute[182725]: 2026-01-22 22:14:05.198 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:14:07 compute-0 nova_compute[182725]: 2026-01-22 22:14:07.196 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:07 compute-0 nova_compute[182725]: 2026-01-22 22:14:07.196 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:14:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:14:12.422 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:14:12.422 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:14:12.422 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:14:13 compute-0 podman[211058]: 2026-01-22 22:14:13.193677748 +0000 UTC m=+0.114148565 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 22:14:20 compute-0 podman[211079]: 2026-01-22 22:14:20.126335748 +0000 UTC m=+0.050053621 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Jan 22 22:14:20 compute-0 podman[211078]: 2026-01-22 22:14:20.193936549 +0000 UTC m=+0.121465736 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 22:14:27 compute-0 podman[211125]: 2026-01-22 22:14:27.13155739 +0000 UTC m=+0.061036341 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:14:27 compute-0 podman[211124]: 2026-01-22 22:14:27.132944264 +0000 UTC m=+0.066665349 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:14:33 compute-0 podman[211164]: 2026-01-22 22:14:33.132871557 +0000 UTC m=+0.065784987 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:14:44 compute-0 podman[211188]: 2026-01-22 22:14:44.146197176 +0000 UTC m=+0.073903466 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:14:51 compute-0 podman[211210]: 2026-01-22 22:14:51.14310279 +0000 UTC m=+0.067599472 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 22:14:51 compute-0 podman[211209]: 2026-01-22 22:14:51.160247691 +0000 UTC m=+0.086297711 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:14:58 compute-0 podman[211254]: 2026-01-22 22:14:58.166212477 +0000 UTC m=+0.090283319 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:14:58 compute-0 podman[211255]: 2026-01-22 22:14:58.168549224 +0000 UTC m=+0.087658104 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:15:01 compute-0 nova_compute[182725]: 2026-01-22 22:15:01.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:01 compute-0 nova_compute[182725]: 2026-01-22 22:15:01.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:15:01 compute-0 nova_compute[182725]: 2026-01-22 22:15:01.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:15:04 compute-0 podman[211298]: 2026-01-22 22:15:04.131164279 +0000 UTC m=+0.069679003 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.595 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.595 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.595 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.596 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.641 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.642 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.642 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.642 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.787 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.789 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6065MB free_disk=73.41773986816406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.790 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.790 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.950 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.951 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:15:04 compute-0 nova_compute[182725]: 2026-01-22 22:15:04.998 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.051 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.053 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.053 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.346 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.347 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.368 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:05 compute-0 nova_compute[182725]: 2026-01-22 22:15:05.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:15:06 compute-0 nova_compute[182725]: 2026-01-22 22:15:06.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:06 compute-0 nova_compute[182725]: 2026-01-22 22:15:06.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:12.327 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:12.328 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:12.423 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:12.423 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:12.424 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:15 compute-0 podman[211324]: 2026-01-22 22:15:15.185222929 +0000 UTC m=+0.111908330 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 22:15:21 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:21.330 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:22 compute-0 podman[211346]: 2026-01-22 22:15:22.162377479 +0000 UTC m=+0.092527613 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:15:22 compute-0 podman[211347]: 2026-01-22 22:15:22.186200454 +0000 UTC m=+0.104471977 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.051 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.052 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.072 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.284 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.285 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.294 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.294 182729 INFO nova.compute.claims [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.453 182729 DEBUG nova.compute.provider_tree [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.471 182729 DEBUG nova.scheduler.client.report [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.502 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.503 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.573 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.573 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.609 182729 INFO nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.665 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.882 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.884 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.885 182729 INFO nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Creating image(s)
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.887 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.887 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.888 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.889 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:26 compute-0 nova_compute[182725]: 2026-01-22 22:15:26.891 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:28 compute-0 nova_compute[182725]: 2026-01-22 22:15:28.580 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Automatically allocating a network for project a5c10a85a248465c960e573d380cd07d. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Jan 22 22:15:29 compute-0 podman[211393]: 2026-01-22 22:15:29.155851218 +0000 UTC m=+0.083757178 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 22:15:29 compute-0 podman[211394]: 2026-01-22 22:15:29.15634156 +0000 UTC m=+0.081334298 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.680 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.740 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.741 182729 DEBUG nova.virt.images [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] 48dd0ec8-2856-44d4-b286-44fdc64ba78d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.742 182729 DEBUG nova.privsep.utils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.743 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.part /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.951 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.part /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.converted" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:30 compute-0 nova_compute[182725]: 2026-01-22 22:15:30.959 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.029 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e.converted --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.030 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.043 182729 INFO oslo.privsep.daemon [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpwb376g4j/privsep.sock']
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.773 182729 INFO oslo.privsep.daemon [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Spawned new privsep daemon via rootwrap
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.625 211457 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.629 211457 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.632 211457 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.632 211457 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211457
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.886 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.975 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.976 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.977 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:31 compute-0 nova_compute[182725]: 2026-01-22 22:15:31.988 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.053 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.055 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.111 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.113 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.113 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.214 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.216 182729 DEBUG nova.virt.disk.api [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Checking if we can resize image /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.216 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.287 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.288 182729 DEBUG nova.virt.disk.api [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Cannot resize image /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.289 182729 DEBUG nova.objects.instance [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lazy-loading 'migration_context' on Instance uuid 0428b0ef-005b-41a0-9d4b-6c37db082797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.313 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.314 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Ensure instance console log exists: /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.315 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.315 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:32 compute-0 nova_compute[182725]: 2026-01-22 22:15:32.315 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.061 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Acquiring lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.062 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.093 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.207 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.207 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.214 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.214 182729 INFO nova.compute.claims [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.400 182729 DEBUG nova.compute.provider_tree [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.435 182729 ERROR nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [req-ae370a2e-2795-4b42-9b72-fcd89f19c059] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 4f7db789-7f4b-4901-9c88-ecf66d0aff43.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-ae370a2e-2795-4b42-9b72-fcd89f19c059"}]}
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.456 182729 DEBUG nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.488 182729 DEBUG nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.488 182729 DEBUG nova.compute.provider_tree [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.511 182729 DEBUG nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.571 182729 DEBUG nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.652 182729 DEBUG nova.compute.provider_tree [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.712 182729 DEBUG nova.scheduler.client.report [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updated inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.713 182729 DEBUG nova.compute.provider_tree [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.713 182729 DEBUG nova.compute.provider_tree [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.743 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.744 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.843 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.843 182729 DEBUG nova.network.neutron [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.878 182729 INFO nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:15:34 compute-0 nova_compute[182725]: 2026-01-22 22:15:34.907 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:15:35 compute-0 podman[211474]: 2026-01-22 22:15:35.137966743 +0000 UTC m=+0.061739395 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.474 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.477 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.478 182729 INFO nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Creating image(s)
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.479 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Acquiring lock "/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.480 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.481 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.509 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.595 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.596 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.597 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.608 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.675 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.676 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.729 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.730 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.730 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.751 182729 DEBUG nova.network.neutron [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.752 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.820 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.821 182729 DEBUG nova.virt.disk.api [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Checking if we can resize image /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.821 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.872 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.872 182729 DEBUG nova.virt.disk.api [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Cannot resize image /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:15:35 compute-0 nova_compute[182725]: 2026-01-22 22:15:35.873 182729 DEBUG nova.objects.instance [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lazy-loading 'migration_context' on Instance uuid cf308097-1be0-4cc4-ac3f-57f33504bcba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.027 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.028 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Ensure instance console log exists: /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.029 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.029 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.030 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.033 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.040 182729 WARNING nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.047 182729 DEBUG nova.virt.libvirt.host [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.048 182729 DEBUG nova.virt.libvirt.host [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.052 182729 DEBUG nova.virt.libvirt.host [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.053 182729 DEBUG nova.virt.libvirt.host [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.055 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.055 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.056 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.057 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.057 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.057 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.058 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.058 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.059 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.059 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.060 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.060 182729 DEBUG nova.virt.hardware [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.066 182729 DEBUG nova.privsep.utils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.068 182729 DEBUG nova.objects.instance [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid cf308097-1be0-4cc4-ac3f-57f33504bcba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.086 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <uuid>cf308097-1be0-4cc4-ac3f-57f33504bcba</uuid>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <name>instance-00000005</name>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-1941017378</nova:name>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:15:36</nova:creationTime>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:user uuid="d4c505f2d67543899e6e04f14c01bac8">tempest-DeleteServersAdminTestJSON-1313849642-project-member</nova:user>
Jan 22 22:15:36 compute-0 nova_compute[182725]:         <nova:project uuid="96f93bebed4a4169b7c9e267d072c1e7">tempest-DeleteServersAdminTestJSON-1313849642</nova:project>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <system>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="serial">cf308097-1be0-4cc4-ac3f-57f33504bcba</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="uuid">cf308097-1be0-4cc4-ac3f-57f33504bcba</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </system>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <os>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </os>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <features>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </features>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.config"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/console.log" append="off"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <video>
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </video>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:15:36 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:15:36 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:15:36 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:15:36 compute-0 nova_compute[182725]: </domain>
Jan 22 22:15:36 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.142 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.142 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:15:36 compute-0 nova_compute[182725]: 2026-01-22 22:15:36.143 182729 INFO nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Using config drive
Jan 22 22:15:37 compute-0 nova_compute[182725]: 2026-01-22 22:15:37.293 182729 INFO nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Creating config drive at /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.config
Jan 22 22:15:37 compute-0 nova_compute[182725]: 2026-01-22 22:15:37.302 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3c61m6w2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:37 compute-0 nova_compute[182725]: 2026-01-22 22:15:37.424 182729 DEBUG oslo_concurrency.processutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3c61m6w2" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:37 compute-0 systemd-machined[154006]: New machine qemu-1-instance-00000005.
Jan 22 22:15:37 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.025 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.026 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.030 182729 INFO nova.virt.libvirt.driver [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance spawned successfully.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.030 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.039 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120138.0392926, cf308097-1be0-4cc4-ac3f-57f33504bcba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.040 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] VM Resumed (Lifecycle Event)
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.099 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.104 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.104 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.105 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.106 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.106 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.107 182729 DEBUG nova.virt.libvirt.driver [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.111 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.185 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.186 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120138.0411663, cf308097-1be0-4cc4-ac3f-57f33504bcba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.186 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] VM Started (Lifecycle Event)
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.209 182729 INFO nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Took 2.73 seconds to spawn the instance on the hypervisor.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.210 182729 DEBUG nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.212 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.220 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.247 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.311 182729 INFO nova.compute.manager [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Took 4.15 seconds to build instance.
Jan 22 22:15:38 compute-0 nova_compute[182725]: 2026-01-22 22:15:38.348 182729 DEBUG oslo_concurrency.lockutils [None req-148001fa-6859-4e74-858e-0e5705fc1ff3 d4c505f2d67543899e6e04f14c01bac8 96f93bebed4a4169b7c9e267d072c1e7 - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.203 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Acquiring lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.205 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.206 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Acquiring lock "cf308097-1be0-4cc4-ac3f-57f33504bcba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.207 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.207 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.221 182729 INFO nova.compute.manager [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Terminating instance
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.236 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Acquiring lock "refresh_cache-cf308097-1be0-4cc4-ac3f-57f33504bcba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.237 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Acquired lock "refresh_cache-cf308097-1be0-4cc4-ac3f-57f33504bcba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.237 182729 DEBUG nova.network.neutron [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:15:40 compute-0 nova_compute[182725]: 2026-01-22 22:15:40.489 182729 DEBUG nova.network.neutron [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.483 182729 DEBUG nova.network.neutron [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.504 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Releasing lock "refresh_cache-cf308097-1be0-4cc4-ac3f-57f33504bcba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.505 182729 DEBUG nova.compute.manager [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:15:41 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 22 22:15:41 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 4.035s CPU time.
Jan 22 22:15:41 compute-0 systemd-machined[154006]: Machine qemu-1-instance-00000005 terminated.
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.756 182729 INFO nova.virt.libvirt.driver [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance destroyed successfully.
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.757 182729 DEBUG nova.objects.instance [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lazy-loading 'resources' on Instance uuid cf308097-1be0-4cc4-ac3f-57f33504bcba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.781 182729 INFO nova.virt.libvirt.driver [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Deleting instance files /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba_del
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.782 182729 INFO nova.virt.libvirt.driver [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Deletion of /var/lib/nova/instances/cf308097-1be0-4cc4-ac3f-57f33504bcba_del complete
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.899 182729 DEBUG nova.virt.libvirt.host [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.899 182729 INFO nova.virt.libvirt.host [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] UEFI support detected
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.901 182729 INFO nova.compute.manager [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.902 182729 DEBUG oslo.service.loopingcall [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.902 182729 DEBUG nova.compute.manager [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:15:41 compute-0 nova_compute[182725]: 2026-01-22 22:15:41.902 182729 DEBUG nova.network.neutron [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.241 182729 DEBUG nova.network.neutron [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.293 182729 DEBUG nova.network.neutron [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.311 182729 INFO nova.compute.manager [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Took 0.41 seconds to deallocate network for instance.
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.446 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.446 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.571 182729 DEBUG nova.compute.provider_tree [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.595 182729 DEBUG nova.scheduler.client.report [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.621 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.664 182729 INFO nova.scheduler.client.report [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Deleted allocations for instance cf308097-1be0-4cc4-ac3f-57f33504bcba
Jan 22 22:15:42 compute-0 nova_compute[182725]: 2026-01-22 22:15:42.782 182729 DEBUG oslo_concurrency.lockutils [None req-13922fda-a509-4afa-a684-4156a2db7514 1e6f27b6433d4e0397336be4396025ec 8abdde5aa6ad4f87b326bd3b46fdafab - - default default] Lock "cf308097-1be0-4cc4-ac3f-57f33504bcba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:45 compute-0 nova_compute[182725]: 2026-01-22 22:15:45.722 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Automatically allocated network: {'id': '0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'name': 'auto_allocated_network', 'tenant_id': 'a5c10a85a248465c960e573d380cd07d', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['6b888f18-e532-4bad-bfe2-a85dc22ccb5f', 'e0e71b11-1b8c-4599-98d4-e9520c16250f'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-22T22:15:29Z', 'updated_at': '2026-01-22T22:15:44Z', 'revision_number': 4, 'project_id': 'a5c10a85a248465c960e573d380cd07d'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Jan 22 22:15:45 compute-0 nova_compute[182725]: 2026-01-22 22:15:45.733 182729 WARNING oslo_policy.policy [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 22 22:15:45 compute-0 nova_compute[182725]: 2026-01-22 22:15:45.734 182729 WARNING oslo_policy.policy [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 22 22:15:45 compute-0 nova_compute[182725]: 2026-01-22 22:15:45.736 182729 DEBUG nova.policy [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd32f3e08e3df4d1ab5b54cafb9d93176', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5c10a85a248465c960e573d380cd07d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:15:46 compute-0 podman[211549]: 2026-01-22 22:15:46.175241986 +0000 UTC m=+0.100968493 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 22:15:47 compute-0 nova_compute[182725]: 2026-01-22 22:15:47.067 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Successfully created port: a329cfea-be66-46f7-a541-7aa6bb72140c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:15:48 compute-0 nova_compute[182725]: 2026-01-22 22:15:48.490 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Successfully updated port: a329cfea-be66-46f7-a541-7aa6bb72140c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:15:48 compute-0 nova_compute[182725]: 2026-01-22 22:15:48.513 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:15:48 compute-0 nova_compute[182725]: 2026-01-22 22:15:48.513 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquired lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:15:48 compute-0 nova_compute[182725]: 2026-01-22 22:15:48.514 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:15:48 compute-0 nova_compute[182725]: 2026-01-22 22:15:48.939 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:15:49 compute-0 nova_compute[182725]: 2026-01-22 22:15:49.201 182729 DEBUG nova.compute.manager [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-changed-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:15:49 compute-0 nova_compute[182725]: 2026-01-22 22:15:49.202 182729 DEBUG nova.compute.manager [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Refreshing instance network info cache due to event network-changed-a329cfea-be66-46f7-a541-7aa6bb72140c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:15:49 compute-0 nova_compute[182725]: 2026-01-22 22:15:49.202 182729 DEBUG oslo_concurrency.lockutils [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.298 182729 DEBUG nova.network.neutron [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Updating instance_info_cache with network_info: [{"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.327 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Releasing lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.328 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Instance network_info: |[{"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.328 182729 DEBUG oslo_concurrency.lockutils [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.329 182729 DEBUG nova.network.neutron [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Refreshing network info cache for port a329cfea-be66-46f7-a541-7aa6bb72140c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.332 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Start _get_guest_xml network_info=[{"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.337 182729 WARNING nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.341 182729 DEBUG nova.virt.libvirt.host [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.342 182729 DEBUG nova.virt.libvirt.host [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.347 182729 DEBUG nova.virt.libvirt.host [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.347 182729 DEBUG nova.virt.libvirt.host [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.348 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.349 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.349 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.349 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.350 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.350 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.350 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.350 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.351 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.351 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.351 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.351 182729 DEBUG nova.virt.hardware [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.354 182729 DEBUG nova.virt.libvirt.vif [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-662460332-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662460332-1',id=2,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5c10a85a248465c960e573d380cd07d',ramdisk_id='',reservation_id='r-0ajzb810',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2038642621',owner_user_name='tempest-AutoAllocateNetworkTest-2038642621-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:15:26Z,user_data=None,user_id='d32f3e08e3df4d1ab5b54cafb9d93176',uuid=0428b0ef-005b-41a0-9d4b-6c37db082797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.355 182729 DEBUG nova.network.os_vif_util [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converting VIF {"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.356 182729 DEBUG nova.network.os_vif_util [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.358 182729 DEBUG nova.objects.instance [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lazy-loading 'pci_devices' on Instance uuid 0428b0ef-005b-41a0-9d4b-6c37db082797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.374 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <uuid>0428b0ef-005b-41a0-9d4b-6c37db082797</uuid>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <name>instance-00000002</name>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:name>tempest-tempest.common.compute-instance-662460332-1</nova:name>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:15:51</nova:creationTime>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:user uuid="d32f3e08e3df4d1ab5b54cafb9d93176">tempest-AutoAllocateNetworkTest-2038642621-project-member</nova:user>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:project uuid="a5c10a85a248465c960e573d380cd07d">tempest-AutoAllocateNetworkTest-2038642621</nova:project>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         <nova:port uuid="a329cfea-be66-46f7-a541-7aa6bb72140c">
Jan 22 22:15:51 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.1.0.150" ipVersion="4"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="fdfe:381f:8400:2::1bd" ipVersion="6"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <system>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="serial">0428b0ef-005b-41a0-9d4b-6c37db082797</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="uuid">0428b0ef-005b-41a0-9d4b-6c37db082797</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </system>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <os>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </os>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <features>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </features>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.config"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:fe:79:c3"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <target dev="tapa329cfea-be"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/console.log" append="off"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <video>
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </video>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:15:51 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:15:51 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:15:51 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:15:51 compute-0 nova_compute[182725]: </domain>
Jan 22 22:15:51 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.375 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Preparing to wait for external event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.376 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.376 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.376 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.377 182729 DEBUG nova.virt.libvirt.vif [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-662460332-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662460332-1',id=2,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5c10a85a248465c960e573d380cd07d',ramdisk_id='',reservation_id='r-0ajzb810',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2038642621',owner_user_name='tempest-AutoAllocateNetworkTest-2038642621-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:15:26Z,user_data=None,user_id='d32f3e08e3df4d1ab5b54cafb9d93176',uuid=0428b0ef-005b-41a0-9d4b-6c37db082797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.377 182729 DEBUG nova.network.os_vif_util [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converting VIF {"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.378 182729 DEBUG nova.network.os_vif_util [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.379 182729 DEBUG os_vif [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.411 182729 DEBUG ovsdbapp.backend.ovs_idl [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.412 182729 DEBUG ovsdbapp.backend.ovs_idl [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.412 182729 DEBUG ovsdbapp.backend.ovs_idl [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.414 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.414 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.417 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.419 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.427 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.427 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.428 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:15:51 compute-0 nova_compute[182725]: 2026-01-22 22:15:51.429 182729 INFO oslo.privsep.daemon [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpz_winaua/privsep.sock']
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.237 182729 INFO oslo.privsep.daemon [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Spawned new privsep daemon via rootwrap
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.039 211574 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.044 211574 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.046 211574 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.046 211574 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211574
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.555 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.556 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa329cfea-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.557 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa329cfea-be, col_values=(('external_ids', {'iface-id': 'a329cfea-be66-46f7-a541-7aa6bb72140c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:79:c3', 'vm-uuid': '0428b0ef-005b-41a0-9d4b-6c37db082797'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:52 compute-0 NetworkManager[54954]: <info>  [1769120152.5605] manager: (tapa329cfea-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.561 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.568 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.569 182729 INFO os_vif [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be')
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.646 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.646 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.646 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] No VIF found with MAC fa:16:3e:fe:79:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:15:52 compute-0 nova_compute[182725]: 2026-01-22 22:15:52.647 182729 INFO nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Using config drive
Jan 22 22:15:52 compute-0 podman[211582]: 2026-01-22 22:15:52.678943713 +0000 UTC m=+0.066231945 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Jan 22 22:15:52 compute-0 podman[211581]: 2026-01-22 22:15:52.706172115 +0000 UTC m=+0.095806465 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.426 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.477 182729 INFO nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Creating config drive at /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.config
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.481 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnr9vuc83 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.608 182729 DEBUG oslo_concurrency.processutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnr9vuc83" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:15:53 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 22 22:15:53 compute-0 kernel: tapa329cfea-be: entered promiscuous mode
Jan 22 22:15:53 compute-0 NetworkManager[54954]: <info>  [1769120153.6911] manager: (tapa329cfea-be): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Jan 22 22:15:53 compute-0 ovn_controller[94850]: 2026-01-22T22:15:53Z|00027|binding|INFO|Claiming lport a329cfea-be66-46f7-a541-7aa6bb72140c for this chassis.
Jan 22 22:15:53 compute-0 ovn_controller[94850]: 2026-01-22T22:15:53Z|00028|binding|INFO|a329cfea-be66-46f7-a541-7aa6bb72140c: Claiming fa:16:3e:fe:79:c3 10.1.0.150 fdfe:381f:8400:2::1bd
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.693 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.697 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:53.713 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:79:c3 10.1.0.150 fdfe:381f:8400:2::1bd'], port_security=['fa:16:3e:fe:79:c3 10.1.0.150 fdfe:381f:8400:2::1bd'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.150/26 fdfe:381f:8400:2::1bd/64', 'neutron:device_id': '0428b0ef-005b-41a0-9d4b-6c37db082797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5c10a85a248465c960e573d380cd07d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4e4ec01-67f2-4a9a-8ea0-8e8df5ac239e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44f330dd-5be1-46a4-b9d9-2996e37af063, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a329cfea-be66-46f7-a541-7aa6bb72140c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:15:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:53.715 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a329cfea-be66-46f7-a541-7aa6bb72140c in datapath 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 bound to our chassis
Jan 22 22:15:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:53.717 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1
Jan 22 22:15:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:53.719 104215 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpldegw9sj/privsep.sock']
Jan 22 22:15:53 compute-0 systemd-udevd[211646]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:15:53 compute-0 NetworkManager[54954]: <info>  [1769120153.7501] device (tapa329cfea-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:15:53 compute-0 NetworkManager[54954]: <info>  [1769120153.7506] device (tapa329cfea-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:15:53 compute-0 systemd-machined[154006]: New machine qemu-2-instance-00000002.
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.767 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:53 compute-0 ovn_controller[94850]: 2026-01-22T22:15:53Z|00029|binding|INFO|Setting lport a329cfea-be66-46f7-a541-7aa6bb72140c ovn-installed in OVS
Jan 22 22:15:53 compute-0 ovn_controller[94850]: 2026-01-22T22:15:53Z|00030|binding|INFO|Setting lport a329cfea-be66-46f7-a541-7aa6bb72140c up in Southbound
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.775 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:53 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.877 182729 DEBUG nova.network.neutron [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Updated VIF entry in instance network info cache for port a329cfea-be66-46f7-a541-7aa6bb72140c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.877 182729 DEBUG nova.network.neutron [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Updating instance_info_cache with network_info: [{"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:15:53 compute-0 nova_compute[182725]: 2026-01-22 22:15:53.902 182729 DEBUG oslo_concurrency.lockutils [req-910ec64f-f81d-4e64-a261-1d4aed5e776a req-bc0cf956-bd1f-4000-a567-ebcb0f3eb6e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-0428b0ef-005b-41a0-9d4b-6c37db082797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.356 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120154.3552494, 0428b0ef-005b-41a0-9d4b-6c37db082797 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.357 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] VM Started (Lifecycle Event)
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.388 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.396 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120154.3558657, 0428b0ef-005b-41a0-9d4b-6c37db082797 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.396 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] VM Paused (Lifecycle Event)
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.416 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.421 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.440 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.494 104215 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.495 104215 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpldegw9sj/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.327 211671 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.332 211671 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.334 211671 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.334 211671 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211671
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.499 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bf6ad6-e22e-4504-8f19-228f5c76f412]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.924 182729 DEBUG nova.compute.manager [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.925 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.926 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.926 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.927 182729 DEBUG nova.compute.manager [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Processing event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.927 182729 DEBUG nova.compute.manager [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.928 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.928 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.928 182729 DEBUG oslo_concurrency.lockutils [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.929 182729 DEBUG nova.compute.manager [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] No waiting events found dispatching network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.929 182729 WARNING nova.compute.manager [req-3094d5ed-aca0-4916-9c78-db0b71cdf420 req-6a34b5c3-7ee6-4f9e-a6d0-8173d30293f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received unexpected event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c for instance with vm_state building and task_state spawning.
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.931 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.936 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120154.9366453, 0428b0ef-005b-41a0-9d4b-6c37db082797 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.937 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] VM Resumed (Lifecycle Event)
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.942 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.946 182729 INFO nova.virt.libvirt.driver [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Instance spawned successfully.
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.946 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.976 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.977 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.979 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.978 211671 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.980 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.978 211671 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.981 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:54.978 211671 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.982 182729 DEBUG nova.virt.libvirt.driver [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.989 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:54 compute-0 nova_compute[182725]: 2026-01-22 22:15:54.994 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:15:55 compute-0 nova_compute[182725]: 2026-01-22 22:15:55.037 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:15:55 compute-0 nova_compute[182725]: 2026-01-22 22:15:55.079 182729 INFO nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Took 28.20 seconds to spawn the instance on the hypervisor.
Jan 22 22:15:55 compute-0 nova_compute[182725]: 2026-01-22 22:15:55.081 182729 DEBUG nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:55 compute-0 nova_compute[182725]: 2026-01-22 22:15:55.187 182729 INFO nova.compute.manager [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Took 29.04 seconds to build instance.
Jan 22 22:15:55 compute-0 nova_compute[182725]: 2026-01-22 22:15:55.220 182729 DEBUG oslo_concurrency.lockutils [None req-2436e764-b1c0-4a73-bfd0-f3d703df72a7 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 29.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.518 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[91e9874b-bd29-4d8a-9aac-b18c7ef52018]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.520 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0ae2c5c3-f1 in ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.522 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0ae2c5c3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.522 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eda63adc-b245-4c75-a351-397d4bf2bf8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.526 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[45eafaab-70f8-4db8-8574-ed6e2a47ab37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.563 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8e51ec70-9469-4630-9602-709380ae5fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.595 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[45d2c335-c2be-47f5-87bc-5f0922105ddf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:55.598 104215 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpl6jaq26l/privsep.sock']
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.315 104215 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.317 104215 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpl6jaq26l/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.185 211685 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.192 211685 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.196 211685 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.196 211685 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211685
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.321 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef391d4-e84a-4497-8285-cd793a925324]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:56 compute-0 nova_compute[182725]: 2026-01-22 22:15:56.755 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120141.7544405, cf308097-1be0-4cc4-ac3f-57f33504bcba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:15:56 compute-0 nova_compute[182725]: 2026-01-22 22:15:56.757 182729 INFO nova.compute.manager [-] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] VM Stopped (Lifecycle Event)
Jan 22 22:15:56 compute-0 nova_compute[182725]: 2026-01-22 22:15:56.793 182729 DEBUG nova.compute.manager [None req-88887a6f-f77e-4d5d-96f7-9f5e334f42dc - - - - - -] [instance: cf308097-1be0-4cc4-ac3f-57f33504bcba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.906 211685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.906 211685 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:15:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:56.906 211685 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.525 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c398bafc-cc01-4af5-aa37-e7e3d3390fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.555 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e9bf9481-8236-401c-95ff-f007c70df11b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 NetworkManager[54954]: <info>  [1769120157.5573] manager: (tap0ae2c5c3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 22 22:15:57 compute-0 nova_compute[182725]: 2026-01-22 22:15:57.559 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:57 compute-0 systemd-udevd[211697]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.595 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[10c8a16c-2c88-4dda-974e-b06f1c06a91d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.600 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[5aeb34f2-1d68-4f13-ab1a-da092dd26ecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 NetworkManager[54954]: <info>  [1769120157.6313] device (tap0ae2c5c3-f0): carrier: link connected
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.635 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c72a4e0f-d425-432c-9a0d-9056078b2947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.662 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc9ea22-e0cd-414b-b1b1-3f0d3256b222]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ae2c5c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:21:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379523, 'reachable_time': 27442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211715, 'error': None, 'target': 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.682 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[950d2339-a866-448e-a874-1d6c02c71a97]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:2168'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379523, 'tstamp': 379523}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211716, 'error': None, 'target': 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.702 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b119d2a2-a729-49a0-ae6f-ed592cae59e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ae2c5c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:21:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379523, 'reachable_time': 27442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211717, 'error': None, 'target': 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.742 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[571ee2f6-f51a-41dc-9a95-26c3666cce1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.815 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6970dd4e-1464-4b35-a4a4-56022eef9efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.817 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ae2c5c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.817 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.818 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ae2c5c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:57 compute-0 NetworkManager[54954]: <info>  [1769120157.8214] manager: (tap0ae2c5c3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 22 22:15:57 compute-0 kernel: tap0ae2c5c3-f0: entered promiscuous mode
Jan 22 22:15:57 compute-0 nova_compute[182725]: 2026-01-22 22:15:57.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.824 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0ae2c5c3-f0, col_values=(('external_ids', {'iface-id': '4d09bcda-99b3-46bb-b0f3-d0ad2142e948'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:15:57 compute-0 nova_compute[182725]: 2026-01-22 22:15:57.826 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:57 compute-0 ovn_controller[94850]: 2026-01-22T22:15:57Z|00031|binding|INFO|Releasing lport 4d09bcda-99b3-46bb-b0f3-d0ad2142e948 from this chassis (sb_readonly=0)
Jan 22 22:15:57 compute-0 nova_compute[182725]: 2026-01-22 22:15:57.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.850 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.851 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[73bd833a-8538-4c0e-89cb-a577a999972b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.853 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1.pid.haproxy
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:15:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:15:57.854 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'env', 'PROCESS_TAG=haproxy-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:15:58 compute-0 podman[211750]: 2026-01-22 22:15:58.281911004 +0000 UTC m=+0.060191986 container create 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:15:58 compute-0 systemd[1]: Started libpod-conmon-7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c.scope.
Jan 22 22:15:58 compute-0 podman[211750]: 2026-01-22 22:15:58.244960403 +0000 UTC m=+0.023241415 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:15:58 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:15:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7dc38543ca134f41fa5a14e9648b739e6ac692928ca9f4c6ee1dc0f96f9c642/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:15:58 compute-0 podman[211750]: 2026-01-22 22:15:58.399949707 +0000 UTC m=+0.178230729 container init 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:15:58 compute-0 podman[211750]: 2026-01-22 22:15:58.405618946 +0000 UTC m=+0.183899928 container start 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:15:58 compute-0 nova_compute[182725]: 2026-01-22 22:15:58.427 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:15:58 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [NOTICE]   (211770) : New worker (211772) forked
Jan 22 22:15:58 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [NOTICE]   (211770) : Loading success.
Jan 22 22:16:00 compute-0 podman[211781]: 2026-01-22 22:16:00.137395604 +0000 UTC m=+0.065775014 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:16:00 compute-0 podman[211782]: 2026-01-22 22:16:00.155858199 +0000 UTC m=+0.079140293 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.033 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.036 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.037 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.037 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.038 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.063 182729 INFO nova.compute.manager [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Terminating instance
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.234 182729 DEBUG nova.compute.manager [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:16:02 compute-0 kernel: tapa329cfea-be (unregistering): left promiscuous mode
Jan 22 22:16:02 compute-0 NetworkManager[54954]: <info>  [1769120162.2660] device (tapa329cfea-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:16:02 compute-0 ovn_controller[94850]: 2026-01-22T22:16:02Z|00032|binding|INFO|Releasing lport a329cfea-be66-46f7-a541-7aa6bb72140c from this chassis (sb_readonly=0)
Jan 22 22:16:02 compute-0 ovn_controller[94850]: 2026-01-22T22:16:02Z|00033|binding|INFO|Setting lport a329cfea-be66-46f7-a541-7aa6bb72140c down in Southbound
Jan 22 22:16:02 compute-0 ovn_controller[94850]: 2026-01-22T22:16:02Z|00034|binding|INFO|Removing iface tapa329cfea-be ovn-installed in OVS
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.277 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.284 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:79:c3 10.1.0.150 fdfe:381f:8400:2::1bd'], port_security=['fa:16:3e:fe:79:c3 10.1.0.150 fdfe:381f:8400:2::1bd'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.150/26 fdfe:381f:8400:2::1bd/64', 'neutron:device_id': '0428b0ef-005b-41a0-9d4b-6c37db082797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5c10a85a248465c960e573d380cd07d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e4ec01-67f2-4a9a-8ea0-8e8df5ac239e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44f330dd-5be1-46a4-b9d9-2996e37af063, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a329cfea-be66-46f7-a541-7aa6bb72140c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.286 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a329cfea-be66-46f7-a541-7aa6bb72140c in datapath 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 unbound from our chassis
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.287 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.288 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[23b4ffb8-025c-4d62-998d-4b4b1755c10b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.292 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 namespace which is not needed anymore
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.294 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 22 22:16:02 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 8.056s CPU time.
Jan 22 22:16:02 compute-0 systemd-machined[154006]: Machine qemu-2-instance-00000002 terminated.
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [NOTICE]   (211770) : haproxy version is 2.8.14-c23fe91
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [NOTICE]   (211770) : path to executable is /usr/sbin/haproxy
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [WARNING]  (211770) : Exiting Master process...
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [WARNING]  (211770) : Exiting Master process...
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [ALERT]    (211770) : Current worker (211772) exited with code 143 (Terminated)
Jan 22 22:16:02 compute-0 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211766]: [WARNING]  (211770) : All workers exited. Exiting... (0)
Jan 22 22:16:02 compute-0 systemd[1]: libpod-7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c.scope: Deactivated successfully.
Jan 22 22:16:02 compute-0 podman[211846]: 2026-01-22 22:16:02.446419015 +0000 UTC m=+0.055101461 container died 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.458 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.463 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c-userdata-shm.mount: Deactivated successfully.
Jan 22 22:16:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-b7dc38543ca134f41fa5a14e9648b739e6ac692928ca9f4c6ee1dc0f96f9c642-merged.mount: Deactivated successfully.
Jan 22 22:16:02 compute-0 podman[211846]: 2026-01-22 22:16:02.486293569 +0000 UTC m=+0.094976015 container cleanup 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.500 182729 INFO nova.virt.libvirt.driver [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Instance destroyed successfully.
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.501 182729 DEBUG nova.objects.instance [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lazy-loading 'resources' on Instance uuid 0428b0ef-005b-41a0-9d4b-6c37db082797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:16:02 compute-0 systemd[1]: libpod-conmon-7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c.scope: Deactivated successfully.
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.518 182729 DEBUG nova.virt.libvirt.vif [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-662460332-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662460332-1',id=2,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:15:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5c10a85a248465c960e573d380cd07d',ramdisk_id='',reservation_id='r-0ajzb810',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_proj
ect_name='tempest-AutoAllocateNetworkTest-2038642621',owner_user_name='tempest-AutoAllocateNetworkTest-2038642621-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:15:55Z,user_data=None,user_id='d32f3e08e3df4d1ab5b54cafb9d93176',uuid=0428b0ef-005b-41a0-9d4b-6c37db082797,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.518 182729 DEBUG nova.network.os_vif_util [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converting VIF {"id": "a329cfea-be66-46f7-a541-7aa6bb72140c", "address": "fa:16:3e:fe:79:c3", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::1bd", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa329cfea-be", "ovs_interfaceid": "a329cfea-be66-46f7-a541-7aa6bb72140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.519 182729 DEBUG nova.network.os_vif_util [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.520 182729 DEBUG os_vif [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.522 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.522 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa329cfea-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.525 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.526 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.528 182729 INFO os_vif [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:79:c3,bridge_name='br-int',has_traffic_filtering=True,id=a329cfea-be66-46f7-a541-7aa6bb72140c,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa329cfea-be')
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.529 182729 INFO nova.virt.libvirt.driver [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Deleting instance files /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797_del
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.530 182729 INFO nova.virt.libvirt.driver [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Deletion of /var/lib/nova/instances/0428b0ef-005b-41a0-9d4b-6c37db082797_del complete
Jan 22 22:16:02 compute-0 podman[211890]: 2026-01-22 22:16:02.562683513 +0000 UTC m=+0.044729744 container remove 7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.568 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe281a2b-e9d6-41bf-9d4e-f96ef8a508ae]: (4, ('Thu Jan 22 10:16:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 (7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c)\n7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c\nThu Jan 22 10:16:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 (7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c)\n7d2a3a6d003e9a2c5fcb697de21669d6cd8cd57ebd095d5ef69b2ae3c763433c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.571 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[52cec17a-b008-49dd-ac92-01c63217ff6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.572 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ae2c5c3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.574 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 kernel: tap0ae2c5c3-f0: left promiscuous mode
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.579 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[232967ac-1c96-473b-979f-30aafaf38a09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.589 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.600 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5959cfb3-edb7-4c85-8e58-39c76b2c1e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.602 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b9052766-49f4-4523-b6a3-2f77dd646c42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.619 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4fabf085-cf1d-4021-9219-ca01757a9621]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379511, 'reachable_time': 43942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211905, 'error': None, 'target': 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.636 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:16:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:02.637 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f0e879-8074-45b5-bf05-8be87007ed3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d0ae2c5c3\x2df5e5\x2d49e2\x2db4c8\x2db3ced0d580c1.mount: Deactivated successfully.
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.658 182729 INFO nova.compute.manager [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.660 182729 DEBUG oslo.service.loopingcall [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.660 182729 DEBUG nova.compute.manager [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.661 182729 DEBUG nova.network.neutron [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.918 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.919 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:16:02 compute-0 nova_compute[182725]: 2026-01-22 22:16:02.923 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.377 182729 DEBUG nova.compute.manager [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-unplugged-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.378 182729 DEBUG oslo_concurrency.lockutils [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.378 182729 DEBUG oslo_concurrency.lockutils [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.379 182729 DEBUG oslo_concurrency.lockutils [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.379 182729 DEBUG nova.compute.manager [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] No waiting events found dispatching network-vif-unplugged-a329cfea-be66-46f7-a541-7aa6bb72140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.380 182729 DEBUG nova.compute.manager [req-5dcf4bf7-2ef0-475a-9185-4e2901bfc129 req-079fdbf0-abc5-442b-814d-52beaad6622a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-unplugged-a329cfea-be66-46f7-a541-7aa6bb72140c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.431 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:03 compute-0 nova_compute[182725]: 2026-01-22 22:16:03.916 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.121 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.123 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.38351058959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.123 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.124 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.245 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 0428b0ef-005b-41a0-9d4b-6c37db082797 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.246 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.246 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.355 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.383 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.420 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:16:04 compute-0 nova_compute[182725]: 2026-01-22 22:16:04.421 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.008 182729 DEBUG nova.network.neutron [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.075 182729 INFO nova.compute.manager [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Took 2.41 seconds to deallocate network for instance.
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.195 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.196 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.285 182729 DEBUG nova.compute.provider_tree [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.309 182729 DEBUG nova.scheduler.client.report [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.337 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.393 182729 INFO nova.scheduler.client.report [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Deleted allocations for instance 0428b0ef-005b-41a0-9d4b-6c37db082797
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.419 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.635 182729 DEBUG oslo_concurrency.lockutils [None req-7f1959d0-8dda-4612-a75c-b1b7dbcec720 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.988 182729 DEBUG nova.compute.manager [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.989 182729 DEBUG oslo_concurrency.lockutils [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.989 182729 DEBUG oslo_concurrency.lockutils [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.989 182729 DEBUG oslo_concurrency.lockutils [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0428b0ef-005b-41a0-9d4b-6c37db082797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.990 182729 DEBUG nova.compute.manager [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] No waiting events found dispatching network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:05 compute-0 nova_compute[182725]: 2026-01-22 22:16:05.990 182729 WARNING nova.compute.manager [req-d780a625-aadf-4260-9275-f91a8ca70f11 req-536c8c7d-cf33-4534-a111-fe74ab12ff1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received unexpected event network-vif-plugged-a329cfea-be66-46f7-a541-7aa6bb72140c for instance with vm_state deleted and task_state None.
Jan 22 22:16:06 compute-0 nova_compute[182725]: 2026-01-22 22:16:06.085 182729 DEBUG nova.compute.manager [req-2908fe1f-8f95-4584-87ae-242aae16ecbd req-f8dfbd6f-4ff2-4fe6-b4e0-ec74339f601c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Received event network-vif-deleted-a329cfea-be66-46f7-a541-7aa6bb72140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:06 compute-0 podman[211909]: 2026-01-22 22:16:06.152150766 +0000 UTC m=+0.077046492 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:16:06 compute-0 nova_compute[182725]: 2026-01-22 22:16:06.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:06 compute-0 nova_compute[182725]: 2026-01-22 22:16:06.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:06 compute-0 nova_compute[182725]: 2026-01-22 22:16:06.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:06 compute-0 nova_compute[182725]: 2026-01-22 22:16:06.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:16:07 compute-0 nova_compute[182725]: 2026-01-22 22:16:07.527 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:07 compute-0 nova_compute[182725]: 2026-01-22 22:16:07.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:16:08 compute-0 nova_compute[182725]: 2026-01-22 22:16:08.434 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:16:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:16:09 compute-0 rsyslogd[1008]: imjournal: 1646 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 22:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:12.423 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:12.424 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:12.424 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:12 compute-0 nova_compute[182725]: 2026-01-22 22:16:12.530 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:13 compute-0 nova_compute[182725]: 2026-01-22 22:16:13.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:13.597 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:16:13 compute-0 nova_compute[182725]: 2026-01-22 22:16:13.598 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:13.598 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:16:17 compute-0 nova_compute[182725]: 2026-01-22 22:16:17.072 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:17 compute-0 podman[211935]: 2026-01-22 22:16:17.162024461 +0000 UTC m=+0.075074294 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 22 22:16:17 compute-0 nova_compute[182725]: 2026-01-22 22:16:17.499 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120162.4982572, 0428b0ef-005b-41a0-9d4b-6c37db082797 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:16:17 compute-0 nova_compute[182725]: 2026-01-22 22:16:17.500 182729 INFO nova.compute.manager [-] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] VM Stopped (Lifecycle Event)
Jan 22 22:16:17 compute-0 nova_compute[182725]: 2026-01-22 22:16:17.520 182729 DEBUG nova.compute.manager [None req-619df4f9-19ea-42b8-88b2-938c0aa28909 - - - - - -] [instance: 0428b0ef-005b-41a0-9d4b-6c37db082797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:17 compute-0 nova_compute[182725]: 2026-01-22 22:16:17.532 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:18 compute-0 nova_compute[182725]: 2026-01-22 22:16:18.438 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:18.600 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:22 compute-0 nova_compute[182725]: 2026-01-22 22:16:22.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:23 compute-0 podman[211958]: 2026-01-22 22:16:23.206348252 +0000 UTC m=+0.127181789 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 22:16:23 compute-0 podman[211957]: 2026-01-22 22:16:23.209733986 +0000 UTC m=+0.136266674 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 22:16:23 compute-0 nova_compute[182725]: 2026-01-22 22:16:23.439 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:27 compute-0 nova_compute[182725]: 2026-01-22 22:16:27.542 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:28 compute-0 nova_compute[182725]: 2026-01-22 22:16:28.441 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:31 compute-0 podman[212006]: 2026-01-22 22:16:31.113090124 +0000 UTC m=+0.051128253 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:16:31 compute-0 podman[212005]: 2026-01-22 22:16:31.120364323 +0000 UTC m=+0.058988526 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.547 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.760 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.760 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.795 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.939 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.940 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.950 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:16:32 compute-0 nova_compute[182725]: 2026-01-22 22:16:32.950 182729 INFO nova.compute.claims [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.074 182729 DEBUG nova.compute.provider_tree [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.090 182729 DEBUG nova.scheduler.client.report [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.112 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.114 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.165 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.166 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.180 182729 INFO nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.198 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.286 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.288 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.289 182729 INFO nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating image(s)
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.289 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.290 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.291 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.309 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.382 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.384 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.385 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.401 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.443 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.460 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.461 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.512 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.514 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.515 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.576 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.578 182729 DEBUG nova.virt.disk.api [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Checking if we can resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.579 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.643 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.645 182729 DEBUG nova.virt.disk.api [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Cannot resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.645 182729 DEBUG nova.objects.instance [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'migration_context' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.661 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.662 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Ensure instance console log exists: /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.662 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.663 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.664 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:33 compute-0 nova_compute[182725]: 2026-01-22 22:16:33.761 182729 DEBUG nova.policy [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:16:34 compute-0 nova_compute[182725]: 2026-01-22 22:16:34.649 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Successfully created port: 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:16:35 compute-0 nova_compute[182725]: 2026-01-22 22:16:35.842 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Successfully updated port: 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:16:35 compute-0 nova_compute[182725]: 2026-01-22 22:16:35.863 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:16:35 compute-0 nova_compute[182725]: 2026-01-22 22:16:35.864 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:16:35 compute-0 nova_compute[182725]: 2026-01-22 22:16:35.864 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:16:36 compute-0 nova_compute[182725]: 2026-01-22 22:16:36.071 182729 DEBUG nova.compute.manager [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:36 compute-0 nova_compute[182725]: 2026-01-22 22:16:36.072 182729 DEBUG nova.compute.manager [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing instance network info cache due to event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:16:36 compute-0 nova_compute[182725]: 2026-01-22 22:16:36.072 182729 DEBUG oslo_concurrency.lockutils [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:16:36 compute-0 nova_compute[182725]: 2026-01-22 22:16:36.134 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:16:37 compute-0 podman[212062]: 2026-01-22 22:16:37.16777228 +0000 UTC m=+0.088055534 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.551 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.554 182729 DEBUG nova.network.neutron [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.578 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.579 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance network_info: |[{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.580 182729 DEBUG oslo_concurrency.lockutils [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.580 182729 DEBUG nova.network.neutron [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.584 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Start _get_guest_xml network_info=[{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.589 182729 WARNING nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.597 182729 DEBUG nova.virt.libvirt.host [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.597 182729 DEBUG nova.virt.libvirt.host [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.606 182729 DEBUG nova.virt.libvirt.host [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.607 182729 DEBUG nova.virt.libvirt.host [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.608 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.609 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.609 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.609 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.610 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.610 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.610 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.610 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.611 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.611 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.611 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.612 182729 DEBUG nova.virt.hardware [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.616 182729 DEBUG nova.virt.libvirt.vif [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:33Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.616 182729 DEBUG nova.network.os_vif_util [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.617 182729 DEBUG nova.network.os_vif_util [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.618 182729 DEBUG nova.objects.instance [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.635 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <uuid>1c2458ea-22d6-480f-ae75-5f050eb08b2b</uuid>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <name>instance-00000007</name>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-489483157</nova:name>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:16:37</nova:creationTime>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:user uuid="f591d36af603475bbc613d6c93854a42">tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member</nova:user>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:project uuid="4ff5f7f17f1c471986dfd67f5192359f">tempest-LiveAutoBlockMigrationV225Test-1833907945</nova:project>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         <nova:port uuid="3cbb0272-18e2-4845-aa69-d6a35ecb0d03">
Jan 22 22:16:37 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <system>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="serial">1c2458ea-22d6-480f-ae75-5f050eb08b2b</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="uuid">1c2458ea-22d6-480f-ae75-5f050eb08b2b</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </system>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <os>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </os>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <features>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </features>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:8f:6c:2e"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <target dev="tap3cbb0272-18"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/console.log" append="off"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <video>
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </video>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:16:37 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:16:37 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:16:37 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:16:37 compute-0 nova_compute[182725]: </domain>
Jan 22 22:16:37 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.637 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Preparing to wait for external event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.638 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.639 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.639 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.640 182729 DEBUG nova.virt.libvirt.vif [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-18
33907945-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:33Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.641 182729 DEBUG nova.network.os_vif_util [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.643 182729 DEBUG nova.network.os_vif_util [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.643 182729 DEBUG os_vif [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.645 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.645 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.646 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.651 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.651 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cbb0272-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.652 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cbb0272-18, col_values=(('external_ids', {'iface-id': '3cbb0272-18e2-4845-aa69-d6a35ecb0d03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:6c:2e', 'vm-uuid': '1c2458ea-22d6-480f-ae75-5f050eb08b2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.654 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:37 compute-0 NetworkManager[54954]: <info>  [1769120197.6559] manager: (tap3cbb0272-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.657 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.663 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.664 182729 INFO os_vif [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.726 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.727 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.727 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No VIF found with MAC fa:16:3e:8f:6c:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:16:37 compute-0 nova_compute[182725]: 2026-01-22 22:16:37.728 182729 INFO nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Using config drive
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.438 182729 INFO nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating config drive at /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.445 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpafza18wd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.473 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.587 182729 DEBUG oslo_concurrency.processutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpafza18wd" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:38 compute-0 kernel: tap3cbb0272-18: entered promiscuous mode
Jan 22 22:16:38 compute-0 NetworkManager[54954]: <info>  [1769120198.6743] manager: (tap3cbb0272-18): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 22 22:16:38 compute-0 ovn_controller[94850]: 2026-01-22T22:16:38Z|00035|binding|INFO|Claiming lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for this chassis.
Jan 22 22:16:38 compute-0 ovn_controller[94850]: 2026-01-22T22:16:38Z|00036|binding|INFO|3cbb0272-18e2-4845-aa69-d6a35ecb0d03: Claiming fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.676 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.684 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.700 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.702 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b bound to our chassis
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.704 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:16:38 compute-0 systemd-udevd[212107]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.718 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[75d9e1e3-ec8a-4fbd-8f57-d6ed06b0936d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.719 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0265f228-41 in ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.720 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0265f228-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.720 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3d93de66-e920-4700-af95-48dc52214ee7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.722 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b49f4a-12b4-4ac2-9bbb-e4b088cd4234]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 systemd-machined[154006]: New machine qemu-3-instance-00000007.
Jan 22 22:16:38 compute-0 NetworkManager[54954]: <info>  [1769120198.7338] device (tap3cbb0272-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:16:38 compute-0 NetworkManager[54954]: <info>  [1769120198.7353] device (tap3cbb0272-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.746 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[86d3ef3f-eeaf-49ea-81f6-4615e1e7110e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.755 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:38 compute-0 ovn_controller[94850]: 2026-01-22T22:16:38Z|00037|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 ovn-installed in OVS
Jan 22 22:16:38 compute-0 ovn_controller[94850]: 2026-01-22T22:16:38Z|00038|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 up in Southbound
Jan 22 22:16:38 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 22 22:16:38 compute-0 nova_compute[182725]: 2026-01-22 22:16:38.759 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.778 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d07d255-cd28-4f07-8a7a-e971d6ca37b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.812 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8afb1b47-5e53-4f61-8910-96285c1afa5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.818 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[18eba13b-2949-452f-9935-c9a98e2d3df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 NetworkManager[54954]: <info>  [1769120198.8203] manager: (tap0265f228-40): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.852 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a6e27a-6441-4a65-b5ce-dd5fad088717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.856 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[34bf6799-6863-4c08-9358-bda19a8a2bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 NetworkManager[54954]: <info>  [1769120198.8824] device (tap0265f228-40): carrier: link connected
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.889 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8faa01e7-b3a0-4b8f-adc2-960d62155949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.913 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[54c1789e-e62a-4a7f-8d4a-77c9bfc7855c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383648, 'reachable_time': 33555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212141, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.930 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2ead0d64-3c01-4d9c-bae2-0b5c3da0d1ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8003'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383648, 'tstamp': 383648}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212142, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.947 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[872d2d47-8d15-4948-8063-a8f0ff48451c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383648, 'reachable_time': 33555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212143, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:38.989 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6731c8c6-999b-45c6-abc9-6c3ce4bc27dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.043 182729 DEBUG nova.compute.manager [req-77ce39d3-177e-400f-9b7a-cbd45e3af5e6 req-2aa55a1a-1a94-4890-88d6-2208794129f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.043 182729 DEBUG oslo_concurrency.lockutils [req-77ce39d3-177e-400f-9b7a-cbd45e3af5e6 req-2aa55a1a-1a94-4890-88d6-2208794129f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.044 182729 DEBUG oslo_concurrency.lockutils [req-77ce39d3-177e-400f-9b7a-cbd45e3af5e6 req-2aa55a1a-1a94-4890-88d6-2208794129f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.044 182729 DEBUG oslo_concurrency.lockutils [req-77ce39d3-177e-400f-9b7a-cbd45e3af5e6 req-2aa55a1a-1a94-4890-88d6-2208794129f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.044 182729 DEBUG nova.compute.manager [req-77ce39d3-177e-400f-9b7a-cbd45e3af5e6 req-2aa55a1a-1a94-4890-88d6-2208794129f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Processing event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.067 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf0e670-2a1a-4404-a48a-eee1eb2ab33f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.070 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.071 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.071 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.074 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:39 compute-0 NetworkManager[54954]: <info>  [1769120199.0748] manager: (tap0265f228-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 22 22:16:39 compute-0 kernel: tap0265f228-40: entered promiscuous mode
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.080 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:39 compute-0 ovn_controller[94850]: 2026-01-22T22:16:39Z|00039|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.106 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.107 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a779e182-b213-4de7-a0ac-4e8bed5fc171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.107 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:16:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:39.108 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'env', 'PROCESS_TAG=haproxy-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0265f228-4e11-4f15-8d77-6acb409f3f7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.203 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.204 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120199.2020948, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.204 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Started (Lifecycle Event)
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.210 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.214 182729 INFO nova.virt.libvirt.driver [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance spawned successfully.
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.214 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.230 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.238 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.241 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.241 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.249 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.249 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.250 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.250 182729 DEBUG nova.virt.libvirt.driver [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.257 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.257 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120199.2024577, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.257 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Paused (Lifecycle Event)
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.296 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.300 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120199.210075, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.300 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Resumed (Lifecycle Event)
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.326 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.330 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.341 182729 INFO nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 6.05 seconds to spawn the instance on the hypervisor.
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.342 182729 DEBUG nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.369 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.430 182729 INFO nova.compute.manager [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 6.55 seconds to build instance.
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.456 182729 DEBUG oslo_concurrency.lockutils [None req-8a7d40f4-13d9-4c59-ac7f-af2b792cc0b6 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.490 182729 DEBUG nova.network.neutron [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updated VIF entry in instance network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.490 182729 DEBUG nova.network.neutron [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:16:39 compute-0 nova_compute[182725]: 2026-01-22 22:16:39.504 182729 DEBUG oslo_concurrency.lockutils [req-9c392198-c92b-4e67-b4b0-75a53300f537 req-83df0858-3ca6-4424-9082-31e773b403dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:16:39 compute-0 podman[212182]: 2026-01-22 22:16:39.525171628 +0000 UTC m=+0.059698313 container create f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:16:39 compute-0 systemd[1]: Started libpod-conmon-f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b.scope.
Jan 22 22:16:39 compute-0 podman[212182]: 2026-01-22 22:16:39.492312701 +0000 UTC m=+0.026839436 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:16:39 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:16:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6337d790b4058f25fb6f8b98c67604f758028d1c1e4b7c0bb5a397a05fe0215/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:16:39 compute-0 podman[212182]: 2026-01-22 22:16:39.628369513 +0000 UTC m=+0.162896198 container init f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:16:39 compute-0 podman[212182]: 2026-01-22 22:16:39.638078257 +0000 UTC m=+0.172604912 container start f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:16:39 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [NOTICE]   (212202) : New worker (212204) forked
Jan 22 22:16:39 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [NOTICE]   (212202) : Loading success.
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.127 182729 DEBUG nova.compute.manager [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.128 182729 DEBUG oslo_concurrency.lockutils [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.128 182729 DEBUG oslo_concurrency.lockutils [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.128 182729 DEBUG oslo_concurrency.lockutils [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.128 182729 DEBUG nova.compute.manager [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:41 compute-0 nova_compute[182725]: 2026-01-22 22:16:41.128 182729 WARNING nova.compute.manager [req-7d5f82a7-fdf1-40eb-9c59-9096da7edd7f req-ff5e8f8c-f25a-45c9-824b-cb0a5404ee02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state None.
Jan 22 22:16:42 compute-0 nova_compute[182725]: 2026-01-22 22:16:42.654 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:43 compute-0 nova_compute[182725]: 2026-01-22 22:16:43.450 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:44 compute-0 nova_compute[182725]: 2026-01-22 22:16:44.785 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Check if temp file /var/lib/nova/instances/tmp02xgsd7v exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 22 22:16:44 compute-0 nova_compute[182725]: 2026-01-22 22:16:44.788 182729 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.461 182729 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.560 182729 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.562 182729 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.666 182729 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.668 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.668 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.679 182729 INFO nova.compute.rpcapi [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 22 22:16:46 compute-0 nova_compute[182725]: 2026-01-22 22:16:46.680 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:16:47 compute-0 nova_compute[182725]: 2026-01-22 22:16:47.656 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:48 compute-0 podman[212219]: 2026-01-22 22:16:48.15749042 +0000 UTC m=+0.082148857 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:16:48 compute-0 nova_compute[182725]: 2026-01-22 22:16:48.452 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:48 compute-0 sshd-session[212240]: Accepted publickey for nova from 192.168.122.102 port 45630 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:16:48 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:16:48 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:16:48 compute-0 systemd-logind[801]: New session 26 of user nova.
Jan 22 22:16:48 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:16:48 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:16:48 compute-0 systemd[212244]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:16:49 compute-0 systemd[212244]: Queued start job for default target Main User Target.
Jan 22 22:16:49 compute-0 systemd[212244]: Created slice User Application Slice.
Jan 22 22:16:49 compute-0 systemd[212244]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:16:49 compute-0 systemd[212244]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:16:49 compute-0 systemd[212244]: Reached target Paths.
Jan 22 22:16:49 compute-0 systemd[212244]: Reached target Timers.
Jan 22 22:16:49 compute-0 systemd[212244]: Starting D-Bus User Message Bus Socket...
Jan 22 22:16:49 compute-0 systemd[212244]: Starting Create User's Volatile Files and Directories...
Jan 22 22:16:49 compute-0 systemd[212244]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:16:49 compute-0 systemd[212244]: Reached target Sockets.
Jan 22 22:16:49 compute-0 systemd[212244]: Finished Create User's Volatile Files and Directories.
Jan 22 22:16:49 compute-0 systemd[212244]: Reached target Basic System.
Jan 22 22:16:49 compute-0 systemd[212244]: Reached target Main User Target.
Jan 22 22:16:49 compute-0 systemd[212244]: Startup finished in 164ms.
Jan 22 22:16:49 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:16:49 compute-0 systemd[1]: Started Session 26 of User nova.
Jan 22 22:16:49 compute-0 sshd-session[212240]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:16:49 compute-0 sshd-session[212258]: Received disconnect from 192.168.122.102 port 45630:11: disconnected by user
Jan 22 22:16:49 compute-0 sshd-session[212258]: Disconnected from user nova 192.168.122.102 port 45630
Jan 22 22:16:49 compute-0 sshd-session[212240]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:16:49 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 22 22:16:49 compute-0 systemd-logind[801]: Session 26 logged out. Waiting for processes to exit.
Jan 22 22:16:49 compute-0 systemd-logind[801]: Removed session 26.
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.342 182729 DEBUG nova.compute.manager [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.343 182729 DEBUG oslo_concurrency.lockutils [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.343 182729 DEBUG oslo_concurrency.lockutils [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.344 182729 DEBUG oslo_concurrency.lockutils [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.344 182729 DEBUG nova.compute.manager [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.344 182729 DEBUG nova.compute.manager [req-33aebcf3-2e0a-4673-87a2-92541ba6b1dd req-521f8ee2-790c-4012-b814-32d262ec80c2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.747 182729 INFO nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 4.08 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.748 182729 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.766 182729 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9929aa59-9abd-47ee-a61a-7de35c63e2fa),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.794 182729 DEBUG nova.objects.instance [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.796 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.798 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.799 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.816 182729 DEBUG nova.virt.libvirt.vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:16:39Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.816 182729 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.818 182729 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.819 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 22:16:50 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:8f:6c:2e"/>
Jan 22 22:16:50 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:16:50 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:16:50 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:16:50 compute-0 nova_compute[182725]:   <target dev="tap3cbb0272-18"/>
Jan 22 22:16:50 compute-0 nova_compute[182725]: </interface>
Jan 22 22:16:50 compute-0 nova_compute[182725]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 22 22:16:50 compute-0 nova_compute[182725]: 2026-01-22 22:16:50.820 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 22 22:16:51 compute-0 nova_compute[182725]: 2026-01-22 22:16:51.303 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:16:51 compute-0 nova_compute[182725]: 2026-01-22 22:16:51.305 182729 INFO nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 22 22:16:51 compute-0 nova_compute[182725]: 2026-01-22 22:16:51.418 182729 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 22 22:16:51 compute-0 nova_compute[182725]: 2026-01-22 22:16:51.922 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:16:51 compute-0 nova_compute[182725]: 2026-01-22 22:16:51.923 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:16:52 compute-0 ovn_controller[94850]: 2026-01-22T22:16:52Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 22:16:52 compute-0 ovn_controller[94850]: 2026-01-22T22:16:52Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.428 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.429 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.463 182729 DEBUG nova.compute.manager [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.463 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.464 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.464 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.464 182729 DEBUG nova.compute.manager [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.465 182729 WARNING nova.compute.manager [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.465 182729 DEBUG nova.compute.manager [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.465 182729 DEBUG nova.compute.manager [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing instance network info cache due to event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.466 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.466 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.466 182729 DEBUG nova.network.neutron [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.659 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.926 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120212.9256186, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.926 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Paused (Lifecycle Event)
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.953 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.954 182729 DEBUG nova.virt.libvirt.migration [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.959 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.965 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:16:52 compute-0 nova_compute[182725]: 2026-01-22 22:16:52.989 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 22:16:53 compute-0 kernel: tap3cbb0272-18 (unregistering): left promiscuous mode
Jan 22 22:16:53 compute-0 NetworkManager[54954]: <info>  [1769120213.1123] device (tap3cbb0272-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:16:53 compute-0 ovn_controller[94850]: 2026-01-22T22:16:53Z|00040|binding|INFO|Releasing lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 from this chassis (sb_readonly=0)
Jan 22 22:16:53 compute-0 ovn_controller[94850]: 2026-01-22T22:16:53Z|00041|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 down in Southbound
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 ovn_controller[94850]: 2026-01-22T22:16:53Z|00042|binding|INFO|Removing iface tap3cbb0272-18 ovn-installed in OVS
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.124 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.129 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e130c2ec-fef7-4ed2-892d-1e3d7eaab401'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.130 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.132 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0265f228-4e11-4f15-8d77-6acb409f3f7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.133 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8e15a5-cbd0-4958-8bf9-4714f483ec08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.134 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace which is not needed anymore
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.155 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 22 22:16:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 13.528s CPU time.
Jan 22 22:16:53 compute-0 systemd-machined[154006]: Machine qemu-3-instance-00000007 terminated.
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.311 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.318 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [NOTICE]   (212202) : haproxy version is 2.8.14-c23fe91
Jan 22 22:16:53 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [NOTICE]   (212202) : path to executable is /usr/sbin/haproxy
Jan 22 22:16:53 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [WARNING]  (212202) : Exiting Master process...
Jan 22 22:16:53 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [ALERT]    (212202) : Current worker (212204) exited with code 143 (Terminated)
Jan 22 22:16:53 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212198]: [WARNING]  (212202) : All workers exited. Exiting... (0)
Jan 22 22:16:53 compute-0 systemd[1]: libpod-f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b.scope: Deactivated successfully.
Jan 22 22:16:53 compute-0 podman[212302]: 2026-01-22 22:16:53.344385425 +0000 UTC m=+0.069751225 container died f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.364 182729 DEBUG nova.compute.manager [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.365 182729 DEBUG oslo_concurrency.lockutils [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.365 182729 DEBUG oslo_concurrency.lockutils [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.366 182729 DEBUG oslo_concurrency.lockutils [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.366 182729 DEBUG nova.compute.manager [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.367 182729 DEBUG nova.compute.manager [req-3093e90b-ec46-4b27-bfa2-46f80eb0d7b5 req-2fe424ec-78f2-488f-9c45-b3179a575f5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.368 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.368 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.369 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 22:16:53 compute-0 podman[212297]: 2026-01-22 22:16:53.385405336 +0000 UTC m=+0.125282591 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Jan 22 22:16:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-f6337d790b4058f25fb6f8b98c67604f758028d1c1e4b7c0bb5a397a05fe0215-merged.mount: Deactivated successfully.
Jan 22 22:16:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b-userdata-shm.mount: Deactivated successfully.
Jan 22 22:16:53 compute-0 podman[212300]: 2026-01-22 22:16:53.389674194 +0000 UTC m=+0.134914304 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true)
Jan 22 22:16:53 compute-0 podman[212302]: 2026-01-22 22:16:53.399320336 +0000 UTC m=+0.124686136 container cleanup f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:16:53 compute-0 systemd[1]: libpod-conmon-f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b.scope: Deactivated successfully.
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.453 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.457 182729 DEBUG nova.virt.libvirt.guest [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1c2458ea-22d6-480f-ae75-5f050eb08b2b' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.460 182729 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation has completed
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.460 182729 INFO nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] _post_live_migration() is started..
Jan 22 22:16:53 compute-0 podman[212386]: 2026-01-22 22:16:53.46666413 +0000 UTC m=+0.041720600 container remove f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.474 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[49a9d1ea-d4ce-41bd-95cb-f17e7be21e2e]: (4, ('Thu Jan 22 10:16:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b)\nf62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b\nThu Jan 22 10:16:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (f62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b)\nf62cfde9f16b14383f34511f63b128e6ede54f855ba5ff1e0fe9748b5a2dc18b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.476 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[edb0a77e-ddc6-4ac7-bf5d-bbfbdd3de938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.477 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:53 compute-0 kernel: tap0265f228-40: left promiscuous mode
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.488 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 nova_compute[182725]: 2026-01-22 22:16:53.499 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.500 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[54919399-5604-4c87-9f87-ae6da74aa48c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.518 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fdece9-1beb-4061-a180-998f985e6df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.519 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e02507fa-c9d0-4b4a-8721-27421f562c98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.540 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[68d91976-7150-4661-8745-fc0ac09662ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383640, 'reachable_time': 19990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212405, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.545 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:16:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d0265f228\x2d4e11\x2d4f15\x2d8d77\x2d6acb409f3f7b.mount: Deactivated successfully.
Jan 22 22:16:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:16:53.546 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[b41cb227-c596-4463-a8dd-4fc7ffdb851a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.156 182729 DEBUG nova.network.neutron [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updated VIF entry in instance network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.157 182729 DEBUG nova.network.neutron [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.175 182729 DEBUG oslo_concurrency.lockutils [req-d1c21ce9-d33a-43e8-89e4-5218d276363c req-828d53c7-9ccb-46d4-909a-72a335a3d3f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.392 182729 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Activated binding for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.393 182729 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.393 182729 DEBUG nova.virt.libvirt.vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:16:43Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.394 182729 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.394 182729 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.394 182729 DEBUG os_vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.396 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.396 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cbb0272-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.398 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.399 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.402 182729 INFO os_vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.402 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.402 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.402 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.403 182729 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.403 182729 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deleting instance files /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del
Jan 22 22:16:54 compute-0 nova_compute[182725]: 2026-01-22 22:16:54.404 182729 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deletion of /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del complete
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.461 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.461 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.462 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.462 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.463 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.463 182729 WARNING nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.464 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.464 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.465 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.465 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.465 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.466 182729 WARNING nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.467 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.467 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.467 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.468 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.468 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.469 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.469 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.470 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.470 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.471 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.471 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.471 182729 WARNING nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.472 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.472 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.473 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.473 182729 DEBUG oslo_concurrency.lockutils [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.474 182729 DEBUG nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:16:55 compute-0 nova_compute[182725]: 2026-01-22 22:16:55.474 182729 WARNING nova.compute.manager [req-66e49d3b-78f5-4dc1-98b6-8aca3c018c10 req-851772f9-b7b6-4549-b7d2-31672c8709b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.
Jan 22 22:16:58 compute-0 nova_compute[182725]: 2026-01-22 22:16:58.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.173 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.174 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.174 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.200 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.201 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.201 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.202 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:16:59 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:16:59 compute-0 systemd[212244]: Activating special unit Exit the Session...
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped target Main User Target.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped target Basic System.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped target Paths.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped target Sockets.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped target Timers.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:16:59 compute-0 systemd[212244]: Closed D-Bus User Message Bus Socket.
Jan 22 22:16:59 compute-0 systemd[212244]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:16:59 compute-0 systemd[212244]: Removed slice User Application Slice.
Jan 22 22:16:59 compute-0 systemd[212244]: Reached target Shutdown.
Jan 22 22:16:59 compute-0 systemd[212244]: Finished Exit the Session.
Jan 22 22:16:59 compute-0 systemd[212244]: Reached target Exit the Session.
Jan 22 22:16:59 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:16:59 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:16:59 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:16:59 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:16:59 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:16:59 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:16:59 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.400 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.418 182729 WARNING nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.420 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5735MB free_disk=73.38159942626953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": 
"0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.420 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.420 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.463 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration for instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.506 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.541 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration 9929aa59-9abd-47ee-a61a-7de35c63e2fa is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.542 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.542 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.596 182729 DEBUG nova.compute.provider_tree [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.621 182729 DEBUG nova.scheduler.client.report [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.643 182729 DEBUG nova.compute.resource_tracker [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.644 182729 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.660 182729 INFO nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.734 182729 INFO nova.scheduler.client.report [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Deleted allocation for migration 9929aa59-9abd-47ee-a61a-7de35c63e2fa
Jan 22 22:16:59 compute-0 nova_compute[182725]: 2026-01-22 22:16:59.735 182729 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 22 22:17:02 compute-0 podman[212410]: 2026-01-22 22:17:02.130962655 +0000 UTC m=+0.062198295 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:17:02 compute-0 podman[212409]: 2026-01-22 22:17:02.151976914 +0000 UTC m=+0.077105501 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 22:17:02 compute-0 nova_compute[182725]: 2026-01-22 22:17:02.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:02 compute-0 nova_compute[182725]: 2026-01-22 22:17:02.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:17:02 compute-0 nova_compute[182725]: 2026-01-22 22:17:02.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:17:02 compute-0 nova_compute[182725]: 2026-01-22 22:17:02.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:17:03 compute-0 nova_compute[182725]: 2026-01-22 22:17:03.457 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:03 compute-0 nova_compute[182725]: 2026-01-22 22:17:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:03 compute-0 nova_compute[182725]: 2026-01-22 22:17:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:03 compute-0 nova_compute[182725]: 2026-01-22 22:17:03.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:04 compute-0 nova_compute[182725]: 2026-01-22 22:17:04.225 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating tmpfile /var/lib/nova/instances/tmpjb5ww5kl to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 22 22:17:04 compute-0 nova_compute[182725]: 2026-01-22 22:17:04.403 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:04 compute-0 nova_compute[182725]: 2026-01-22 22:17:04.407 182729 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 22 22:17:04 compute-0 nova_compute[182725]: 2026-01-22 22:17:04.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.749 182729 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.786 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.787 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.787 182729 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:05 compute-0 nova_compute[182725]: 2026-01-22 22:17:05.915 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.145 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.146 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5759MB free_disk=73.38159942626953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.146 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.146 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.316 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration for instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.342 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating resource usage from migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.342 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Starting to track incoming migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.389 182729 WARNING nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.390 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.390 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.444 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.456 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.477 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:17:06 compute-0 nova_compute[182725]: 2026-01-22 22:17:06.478 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.952 182729 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.967 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.983 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.983 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating instance directory: /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.984 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating disk.info with the contents: {'/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk': 'qcow2', '/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.985 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 22 22:17:07 compute-0 nova_compute[182725]: 2026-01-22 22:17:07.986 182729 DEBUG nova.objects.instance [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.022 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.129 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.131 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.132 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.157 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 podman[212452]: 2026-01-22 22:17:08.16209784 +0000 UTC m=+0.082494455 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.226 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.229 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.272 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.274 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.275 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.361 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.363 182729 DEBUG nova.virt.disk.api [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Checking if we can resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.364 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.385 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120213.366797, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.386 182729 INFO nova.compute.manager [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Stopped (Lifecycle Event)
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.410 182729 DEBUG nova.compute.manager [None req-5dec2c68-8e8b-4991-b37b-90a6e9743c6b - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.427 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.429 182729 DEBUG nova.virt.disk.api [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Cannot resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.430 182729 DEBUG nova.objects.instance [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.442 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.469 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.478 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.480 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.480 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.485 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config 485376" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.487 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config to /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.488 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.986 182729 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.987 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.990 182729 DEBUG nova.virt.libvirt.vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:58Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.991 182729 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.992 182729 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.993 182729 DEBUG os_vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.994 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.995 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.996 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:08 compute-0 nova_compute[182725]: 2026-01-22 22:17:08.999 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.000 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cbb0272-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.000 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cbb0272-18, col_values=(('external_ids', {'iface-id': '3cbb0272-18e2-4845-aa69-d6a35ecb0d03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:6c:2e', 'vm-uuid': '1c2458ea-22d6-480f-ae75-5f050eb08b2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.003 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:09 compute-0 NetworkManager[54954]: <info>  [1769120229.0041] manager: (tap3cbb0272-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.007 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.009 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.011 182729 INFO os_vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.011 182729 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 22 22:17:09 compute-0 nova_compute[182725]: 2026-01-22 22:17:09.012 182729 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 22 22:17:11 compute-0 nova_compute[182725]: 2026-01-22 22:17:11.284 182729 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 22 22:17:11 compute-0 nova_compute[182725]: 2026-01-22 22:17:11.746 182729 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 22 22:17:11 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 22 22:17:11 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 22 22:17:12 compute-0 kernel: tap3cbb0272-18: entered promiscuous mode
Jan 22 22:17:12 compute-0 NetworkManager[54954]: <info>  [1769120232.1615] manager: (tap3cbb0272-18): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:12 compute-0 ovn_controller[94850]: 2026-01-22T22:17:12Z|00043|binding|INFO|Claiming lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for this additional chassis.
Jan 22 22:17:12 compute-0 ovn_controller[94850]: 2026-01-22T22:17:12Z|00044|binding|INFO|3cbb0272-18e2-4845-aa69-d6a35ecb0d03: Claiming fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:12 compute-0 ovn_controller[94850]: 2026-01-22T22:17:12Z|00045|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 ovn-installed in OVS
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.180 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.182 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:12 compute-0 systemd-udevd[212529]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:17:12 compute-0 systemd-machined[154006]: New machine qemu-4-instance-00000007.
Jan 22 22:17:12 compute-0 NetworkManager[54954]: <info>  [1769120232.2131] device (tap3cbb0272-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:17:12 compute-0 NetworkManager[54954]: <info>  [1769120232.2138] device (tap3cbb0272-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:17:12 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Jan 22 22:17:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:12.424 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:12.425 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:12.425 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.948 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120232.9479287, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.950 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Started (Lifecycle Event)
Jan 22 22:17:12 compute-0 nova_compute[182725]: 2026-01-22 22:17:12.986 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.460 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.812 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120233.8126287, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.813 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Resumed (Lifecycle Event)
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.843 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.847 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:13 compute-0 nova_compute[182725]: 2026-01-22 22:17:13.865 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 22 22:17:14 compute-0 nova_compute[182725]: 2026-01-22 22:17:14.003 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:14 compute-0 ovn_controller[94850]: 2026-01-22T22:17:14Z|00046|binding|INFO|Claiming lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for this chassis.
Jan 22 22:17:14 compute-0 ovn_controller[94850]: 2026-01-22T22:17:14Z|00047|binding|INFO|3cbb0272-18e2-4845-aa69-d6a35ecb0d03: Claiming fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 22:17:14 compute-0 ovn_controller[94850]: 2026-01-22T22:17:14Z|00048|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 up in Southbound
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.847 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.848 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:17:14 compute-0 nova_compute[182725]: 2026-01-22 22:17:14.848 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.850 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '21', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.851 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b bound to our chassis
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.853 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.870 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[be149a74-4011-4402-ab9d-e661650fb321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.871 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0265f228-41 in ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.874 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0265f228-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.874 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8984caf3-a5fe-42f6-b409-7b733ccb2397]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.875 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9637be-aaa7-44ff-9356-39faff64992f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.892 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f782a7-1335-4c5e-9abe-e9a32df35cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.922 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c33e65ca-aeff-4074-87ed-480195479fe4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.952 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f23f3606-b295-4c0e-b118-1df2e7edf925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:14.961 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb2768f-302e-4726-9da8-5ab41f3975d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:14 compute-0 systemd-udevd[212533]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:17:14 compute-0 NetworkManager[54954]: <info>  [1769120234.9636] manager: (tap0265f228-40): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.004 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[35f55003-f936-4d47-a66c-658932a05b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.008 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6416b2bf-f191-4b9b-bb2f-62100d05276e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 NetworkManager[54954]: <info>  [1769120235.0426] device (tap0265f228-40): carrier: link connected
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.043 182729 INFO nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Post operation of migration started
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.050 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[abba8efe-cb24-4602-b773-fd0d1ddc6ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.076 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[98b3273f-5835-410f-ae2f-e9032bf703f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387264, 'reachable_time': 35108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212586, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.100 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1d536e-4ed5-4fc2-a3d0-aa6e47fc53e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8003'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387264, 'tstamp': 387264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212587, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.127 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[35aa6a47-d72b-424b-8fe6-7e08e1e8918c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387264, 'reachable_time': 35108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212588, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.174 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3101ad-f0ee-496d-99cc-233324e07b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.270 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f884d9e3-21c1-486a-a4b3-c23ef6e51a05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.272 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.273 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.273 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.276 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:15 compute-0 NetworkManager[54954]: <info>  [1769120235.2773] manager: (tap0265f228-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 22 22:17:15 compute-0 kernel: tap0265f228-40: entered promiscuous mode
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.282 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.283 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:15 compute-0 ovn_controller[94850]: 2026-01-22T22:17:15Z|00049|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.285 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.286 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.287 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[53b41e3a-97a9-4fb2-8d57-01553df26dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.288 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:17:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:15.289 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'env', 'PROCESS_TAG=haproxy-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0265f228-4e11-4f15-8d77-6acb409f3f7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.306 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.443 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.444 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:15 compute-0 nova_compute[182725]: 2026-01-22 22:17:15.444 182729 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:17:15 compute-0 podman[212621]: 2026-01-22 22:17:15.717559921 +0000 UTC m=+0.066407111 container create bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:17:15 compute-0 podman[212621]: 2026-01-22 22:17:15.680722564 +0000 UTC m=+0.029569774 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:17:15 compute-0 systemd[1]: Started libpod-conmon-bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08.scope.
Jan 22 22:17:15 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:17:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529a7993d85d92e3d3d3bca4c8bf362180b6b9c90e484cf8fe467e0d8afda0f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:17:15 compute-0 podman[212621]: 2026-01-22 22:17:15.83444389 +0000 UTC m=+0.183291100 container init bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:17:15 compute-0 podman[212621]: 2026-01-22 22:17:15.845671623 +0000 UTC m=+0.194518803 container start bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:17:15 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [NOTICE]   (212640) : New worker (212642) forked
Jan 22 22:17:15 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [NOTICE]   (212640) : Loading success.
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.658 182729 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.678 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.713 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.714 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.715 182729 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:17 compute-0 nova_compute[182725]: 2026-01-22 22:17:17.721 182729 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 22 22:17:17 compute-0 virtqemud[182297]: Domain id=4 name='instance-00000007' uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b is tainted: custom-monitor
Jan 22 22:17:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:17.851 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:18 compute-0 nova_compute[182725]: 2026-01-22 22:17:18.462 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:18 compute-0 nova_compute[182725]: 2026-01-22 22:17:18.732 182729 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.007 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.134 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "52297cc9-cee5-40f3-a5db-e330cb26b900" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.135 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.151 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:17:19 compute-0 podman[212651]: 2026-01-22 22:17:19.170407006 +0000 UTC m=+0.094917068 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.240 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.241 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.250 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.250 182729 INFO nova.compute.claims [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.400 182729 DEBUG nova.compute.provider_tree [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.421 182729 DEBUG nova.scheduler.client.report [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.449 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.450 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.550 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.550 182729 DEBUG nova.network.neutron [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.574 182729 INFO nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.603 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.739 182729 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.746 182729 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.752 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.754 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.754 182729 INFO nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Creating image(s)
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.755 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.756 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.757 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.788 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.855 182729 DEBUG nova.objects.instance [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.883 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.884 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.885 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:19 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.907 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:19.999 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.001 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.045 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.046 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.047 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.104 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.106 182729 DEBUG nova.virt.disk.api [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Checking if we can resize image /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.106 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.183 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.184 182729 DEBUG nova.virt.disk.api [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Cannot resize image /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.184 182729 DEBUG nova.objects.instance [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lazy-loading 'migration_context' on Instance uuid 52297cc9-cee5-40f3-a5db-e330cb26b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.195 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.196 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Ensure instance console log exists: /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.196 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.196 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.196 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.661 182729 DEBUG nova.network.neutron [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.662 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.663 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.671 182729 WARNING nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.676 182729 DEBUG nova.virt.libvirt.host [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.676 182729 DEBUG nova.virt.libvirt.host [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.680 182729 DEBUG nova.virt.libvirt.host [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.680 182729 DEBUG nova.virt.libvirt.host [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.682 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.682 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.682 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.682 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.683 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.684 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.684 182729 DEBUG nova.virt.hardware [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.687 182729 DEBUG nova.objects.instance [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52297cc9-cee5-40f3-a5db-e330cb26b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.701 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <uuid>52297cc9-cee5-40f3-a5db-e330cb26b900</uuid>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <name>instance-0000000a</name>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerDiagnosticsTest-server-364641643</nova:name>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:17:20</nova:creationTime>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:user uuid="cbe26c241aea489599384d497e424bd2">tempest-ServerDiagnosticsTest-54291266-project-member</nova:user>
Jan 22 22:17:20 compute-0 nova_compute[182725]:         <nova:project uuid="e5dd24d96a044dad84a15558da285986">tempest-ServerDiagnosticsTest-54291266</nova:project>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <system>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="serial">52297cc9-cee5-40f3-a5db-e330cb26b900</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="uuid">52297cc9-cee5-40f3-a5db-e330cb26b900</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </system>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <os>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </os>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <features>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </features>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.config"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/console.log" append="off"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <video>
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </video>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:17:20 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:17:20 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:17:20 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:17:20 compute-0 nova_compute[182725]: </domain>
Jan 22 22:17:20 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.767 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.767 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.768 182729 INFO nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Using config drive
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.930 182729 INFO nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Creating config drive at /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.config
Jan 22 22:17:20 compute-0 nova_compute[182725]: 2026-01-22 22:17:20.939 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pj5olas execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.086 182729 DEBUG oslo_concurrency.processutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pj5olas" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:21 compute-0 systemd-machined[154006]: New machine qemu-5-instance-0000000a.
Jan 22 22:17:21 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.544 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120241.5435116, 52297cc9-cee5-40f3-a5db-e330cb26b900 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.544 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] VM Resumed (Lifecycle Event)
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.548 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.548 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.552 182729 INFO nova.virt.libvirt.driver [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance spawned successfully.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.553 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.575 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.581 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.581 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.582 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.582 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.583 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.583 182729 DEBUG nova.virt.libvirt.driver [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.588 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.614 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.615 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120241.5436957, 52297cc9-cee5-40f3-a5db-e330cb26b900 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.615 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] VM Started (Lifecycle Event)
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.635 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.639 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.661 182729 INFO nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Took 1.91 seconds to spawn the instance on the hypervisor.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.662 182729 DEBUG nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.667 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.748 182729 INFO nova.compute.manager [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Took 2.54 seconds to build instance.
Jan 22 22:17:21 compute-0 nova_compute[182725]: 2026-01-22 22:17:21.764 182729 DEBUG oslo_concurrency.lockutils [None req-1ae0d7e0-a30c-4b7e-9d4c-6e347cd38863 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.261 182729 DEBUG nova.compute.manager [None req-770e8d14-9b34-462b-a2dd-c85de595184f a34578a2ae8a40b4b279d413fd27b3af 602e0d0bed0148efa42fb0abe09fe4f3 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.267 182729 INFO nova.compute.manager [None req-770e8d14-9b34-462b-a2dd-c85de595184f a34578a2ae8a40b4b279d413fd27b3af 602e0d0bed0148efa42fb0abe09fe4f3 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Retrieving diagnostics
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.490 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "52297cc9-cee5-40f3-a5db-e330cb26b900" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.491 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.491 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "52297cc9-cee5-40f3-a5db-e330cb26b900-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.491 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.492 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.501 182729 INFO nova.compute.manager [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Terminating instance
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.510 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "refresh_cache-52297cc9-cee5-40f3-a5db-e330cb26b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.510 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquired lock "refresh_cache-52297cc9-cee5-40f3-a5db-e330cb26b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.511 182729 DEBUG nova.network.neutron [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:17:22 compute-0 nova_compute[182725]: 2026-01-22 22:17:22.646 182729 DEBUG nova.network.neutron [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.465 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.646 182729 DEBUG nova.network.neutron [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.673 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Releasing lock "refresh_cache-52297cc9-cee5-40f3-a5db-e330cb26b900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.674 182729 DEBUG nova.compute.manager [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:17:23 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 22 22:17:23 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 2.493s CPU time.
Jan 22 22:17:23 compute-0 systemd-machined[154006]: Machine qemu-5-instance-0000000a terminated.
Jan 22 22:17:23 compute-0 podman[212716]: 2026-01-22 22:17:23.808339274 +0000 UTC m=+0.080771512 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:17:23 compute-0 podman[212715]: 2026-01-22 22:17:23.85627157 +0000 UTC m=+0.124581225 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.920 182729 INFO nova.virt.libvirt.driver [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance destroyed successfully.
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.921 182729 DEBUG nova.objects.instance [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lazy-loading 'resources' on Instance uuid 52297cc9-cee5-40f3-a5db-e330cb26b900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.937 182729 INFO nova.virt.libvirt.driver [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Deleting instance files /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900_del
Jan 22 22:17:23 compute-0 nova_compute[182725]: 2026-01-22 22:17:23.938 182729 INFO nova.virt.libvirt.driver [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Deletion of /var/lib/nova/instances/52297cc9-cee5-40f3-a5db-e330cb26b900_del complete
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.009 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.026 182729 INFO nova.compute.manager [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.027 182729 DEBUG oslo.service.loopingcall [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.027 182729 DEBUG nova.compute.manager [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.027 182729 DEBUG nova.network.neutron [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.148 182729 DEBUG nova.network.neutron [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.160 182729 DEBUG nova.network.neutron [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.172 182729 INFO nova.compute.manager [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Took 0.14 seconds to deallocate network for instance.
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.258 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.259 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.332 182729 DEBUG nova.compute.provider_tree [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.366 182729 DEBUG nova.scheduler.client.report [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.408 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.436 182729 INFO nova.scheduler.client.report [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Deleted allocations for instance 52297cc9-cee5-40f3-a5db-e330cb26b900
Jan 22 22:17:24 compute-0 ovn_controller[94850]: 2026-01-22T22:17:24Z|00050|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 22:17:24 compute-0 nova_compute[182725]: 2026-01-22 22:17:24.591 182729 DEBUG oslo_concurrency.lockutils [None req-6dd8c8c7-f312-49ec-90e6-1dfff327a6f3 cbe26c241aea489599384d497e424bd2 e5dd24d96a044dad84a15558da285986 - - default default] Lock "52297cc9-cee5-40f3-a5db-e330cb26b900" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:28 compute-0 nova_compute[182725]: 2026-01-22 22:17:28.468 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:29 compute-0 nova_compute[182725]: 2026-01-22 22:17:29.011 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:33 compute-0 podman[212765]: 2026-01-22 22:17:33.153212596 +0000 UTC m=+0.069509229 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:17:33 compute-0 podman[212766]: 2026-01-22 22:17:33.204879915 +0000 UTC m=+0.113392262 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.470 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.824 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.824 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.846 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.981 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.981 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.988 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:17:33 compute-0 nova_compute[182725]: 2026-01-22 22:17:33.988 182729 INFO nova.compute.claims [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.014 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.131 182729 DEBUG nova.compute.provider_tree [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.145 182729 DEBUG nova.scheduler.client.report [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.176 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.177 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.238 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.239 182729 DEBUG nova.network.neutron [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.260 182729 INFO nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.285 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.428 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.430 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.431 182729 INFO nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating image(s)
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.432 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.433 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.434 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.459 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.539 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.540 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.541 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.564 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.625 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.627 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.669 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.670 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.671 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.732 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.734 182729 DEBUG nova.virt.disk.api [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Checking if we can resize image /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.734 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.787 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.788 182729 DEBUG nova.virt.disk.api [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Cannot resize image /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.789 182729 DEBUG nova.objects.instance [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'migration_context' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.811 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.812 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Ensure instance console log exists: /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.813 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.814 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.814 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:34 compute-0 nova_compute[182725]: 2026-01-22 22:17:34.864 182729 DEBUG nova.policy [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:17:35 compute-0 nova_compute[182725]: 2026-01-22 22:17:35.935 182729 DEBUG nova.network.neutron [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Successfully updated port: 1d0bf445-f745-430d-9927-a3d8cdc9b6fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:17:35 compute-0 nova_compute[182725]: 2026-01-22 22:17:35.957 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:35 compute-0 nova_compute[182725]: 2026-01-22 22:17:35.957 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquired lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:35 compute-0 nova_compute[182725]: 2026-01-22 22:17:35.957 182729 DEBUG nova.network.neutron [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.048 182729 DEBUG nova.compute.manager [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-changed-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.048 182729 DEBUG nova.compute.manager [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Refreshing instance network info cache due to event network-changed-1d0bf445-f745-430d-9927-a3d8cdc9b6fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.048 182729 DEBUG oslo_concurrency.lockutils [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.102 182729 DEBUG nova.network.neutron [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.823 182729 DEBUG nova.network.neutron [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.857 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Releasing lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.858 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance network_info: |[{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.867 182729 DEBUG oslo_concurrency.lockutils [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.868 182729 DEBUG nova.network.neutron [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Refreshing network info cache for port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.874 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Start _get_guest_xml network_info=[{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.883 182729 WARNING nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.897 182729 DEBUG nova.virt.libvirt.host [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.899 182729 DEBUG nova.virt.libvirt.host [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.903 182729 DEBUG nova.virt.libvirt.host [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.904 182729 DEBUG nova.virt.libvirt.host [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.906 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.907 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.908 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.908 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.909 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.909 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.909 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.910 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.910 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.911 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.911 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.912 182729 DEBUG nova.virt.hardware [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.918 182729 DEBUG nova.virt.libvirt.vif [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-183390794
5-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:17:34Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.919 182729 DEBUG nova.network.os_vif_util [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.921 182729 DEBUG nova.network.os_vif_util [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.922 182729 DEBUG nova.objects.instance [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'pci_devices' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.937 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <uuid>eb864a01-1633-42f3-ac5f-4d664cc5d477</uuid>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <name>instance-0000000e</name>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1892112726</nova:name>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:17:36</nova:creationTime>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:user uuid="f591d36af603475bbc613d6c93854a42">tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member</nova:user>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:project uuid="4ff5f7f17f1c471986dfd67f5192359f">tempest-LiveAutoBlockMigrationV225Test-1833907945</nova:project>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         <nova:port uuid="1d0bf445-f745-430d-9927-a3d8cdc9b6fc">
Jan 22 22:17:36 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <system>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="serial">eb864a01-1633-42f3-ac5f-4d664cc5d477</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="uuid">eb864a01-1633-42f3-ac5f-4d664cc5d477</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </system>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <os>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </os>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <features>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </features>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:83:be:3d"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <target dev="tap1d0bf445-f7"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/console.log" append="off"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <video>
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </video>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:17:36 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:17:36 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:17:36 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:17:36 compute-0 nova_compute[182725]: </domain>
Jan 22 22:17:36 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.939 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Preparing to wait for external event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.939 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.940 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.940 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.941 182729 DEBUG nova.virt.libvirt.vif [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:17:34Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.942 182729 DEBUG nova.network.os_vif_util [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.943 182729 DEBUG nova.network.os_vif_util [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.944 182729 DEBUG os_vif [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.946 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.948 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.948 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.953 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d0bf445-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.954 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d0bf445-f7, col_values=(('external_ids', {'iface-id': '1d0bf445-f745-430d-9927-a3d8cdc9b6fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:be:3d', 'vm-uuid': 'eb864a01-1633-42f3-ac5f-4d664cc5d477'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.955 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:36 compute-0 NetworkManager[54954]: <info>  [1769120256.9570] manager: (tap1d0bf445-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:36 compute-0 nova_compute[182725]: 2026-01-22 22:17:36.972 182729 INFO os_vif [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7')
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.098 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.099 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.100 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] No VIF found with MAC fa:16:3e:83:be:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.100 182729 INFO nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Using config drive
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.379 182729 INFO nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating config drive at /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.389 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpybrjb0n3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.519 182729 DEBUG oslo_concurrency.processutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpybrjb0n3" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:37 compute-0 kernel: tap1d0bf445-f7: entered promiscuous mode
Jan 22 22:17:37 compute-0 NetworkManager[54954]: <info>  [1769120257.6224] manager: (tap1d0bf445-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00051|binding|INFO|Claiming lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc for this chassis.
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00052|binding|INFO|1d0bf445-f745-430d-9927-a3d8cdc9b6fc: Claiming fa:16:3e:83:be:3d 10.100.0.6
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00053|binding|INFO|Claiming lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d for this chassis.
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.622 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00054|binding|INFO|9927fe61-75e1-4c06-8f4c-ccc8597a433d: Claiming fa:16:3e:1b:78:60 19.80.0.76
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.645 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:78:60 19.80.0.76'], port_security=['fa:16:3e:1b:78:60 19.80.0.76'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['1d0bf445-f745-430d-9927-a3d8cdc9b6fc'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2040814554', 'neutron:cidrs': '19.80.0.76/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2040814554', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52c62c6f-61b3-4b60-8745-b12d4e251f43, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9927fe61-75e1-4c06-8f4c-ccc8597a433d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.648 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:be:3d 10.100.0.6'], port_security=['fa:16:3e:83:be:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-835502342', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb864a01-1633-42f3-ac5f-4d664cc5d477', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-835502342', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=1d0bf445-f745-430d-9927-a3d8cdc9b6fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.650 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9927fe61-75e1-4c06-8f4c-ccc8597a433d in datapath 1b7cb047-7415-4b9a-be62-075d33a42dfe bound to our chassis
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.653 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b7cb047-7415-4b9a-be62-075d33a42dfe
Jan 22 22:17:37 compute-0 systemd-udevd[212841]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.675 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6481d396-9b41-439d-ae80-08aee8ae3f8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.677 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b7cb047-71 in ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:17:37 compute-0 NetworkManager[54954]: <info>  [1769120257.6795] device (tap1d0bf445-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:17:37 compute-0 NetworkManager[54954]: <info>  [1769120257.6801] device (tap1d0bf445-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.679 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b7cb047-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.679 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94e76bfc-3102-478a-bddf-b529a1785f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.681 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6c4598-f5cb-4991-8e14-e567029138b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 systemd-machined[154006]: New machine qemu-6-instance-0000000e.
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.701 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[7e90b1ed-1176-4037-86e2-8c9ffe719eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.703 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00055|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc ovn-installed in OVS
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00056|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc up in Southbound
Jan 22 22:17:37 compute-0 ovn_controller[94850]: 2026-01-22T22:17:37Z|00057|binding|INFO|Setting lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d up in Southbound
Jan 22 22:17:37 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000e.
Jan 22 22:17:37 compute-0 nova_compute[182725]: 2026-01-22 22:17:37.707 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.733 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3e005e-2279-4c93-9122-d4b19276a410]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.781 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[42ae0b37-004d-4378-96f4-290c2b01e87c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.792 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc69f77-3755-49cf-befa-9b3a289c5a33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 NetworkManager[54954]: <info>  [1769120257.7947] manager: (tap1b7cb047-70): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.847 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[07055e23-4fe2-40c7-9827-37b08c728280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.850 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ca46a87d-bdff-4ce0-a78e-9032d1cd5ec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 NetworkManager[54954]: <info>  [1769120257.8745] device (tap1b7cb047-70): carrier: link connected
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.879 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[41f16c2c-5c22-4410-9b48-52fb8815f78d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.900 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce33588-c72d-4425-9d6c-ea26badfcac8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b7cb047-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:53:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389547, 'reachable_time': 22804, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212879, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.918 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[43008e0d-6365-4571-b27e-d446a3c585e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:534f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389547, 'tstamp': 389547}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212880, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.945 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[559a14e9-0279-4ca7-b731-e4be12367244]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b7cb047-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:53:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389547, 'reachable_time': 22804, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212881, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:37.988 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[607306b7-b023-444c-a382-c64d99fde58c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.020 182729 DEBUG nova.network.neutron [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updated VIF entry in instance network info cache for port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.021 182729 DEBUG nova.network.neutron [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.041 182729 DEBUG nova.compute.manager [req-63ca0750-f234-4665-8722-40ada27d0475 req-5a6041c5-d5a8-4cdf-bc89-a094f90a5134 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.042 182729 DEBUG oslo_concurrency.lockutils [req-63ca0750-f234-4665-8722-40ada27d0475 req-5a6041c5-d5a8-4cdf-bc89-a094f90a5134 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.042 182729 DEBUG oslo_concurrency.lockutils [req-63ca0750-f234-4665-8722-40ada27d0475 req-5a6041c5-d5a8-4cdf-bc89-a094f90a5134 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.043 182729 DEBUG oslo_concurrency.lockutils [req-63ca0750-f234-4665-8722-40ada27d0475 req-5a6041c5-d5a8-4cdf-bc89-a094f90a5134 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.044 182729 DEBUG nova.compute.manager [req-63ca0750-f234-4665-8722-40ada27d0475 req-5a6041c5-d5a8-4cdf-bc89-a094f90a5134 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Processing event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.046 182729 DEBUG oslo_concurrency.lockutils [req-7be5ef5b-a8d6-4d7d-a119-666c299ba482 req-eb778206-6e74-45ad-9430-3720abb6c1b7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.082 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[138e9494-063a-41cb-96cd-e72775a05739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.084 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b7cb047-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.084 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.085 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b7cb047-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.087 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:38 compute-0 NetworkManager[54954]: <info>  [1769120258.0887] manager: (tap1b7cb047-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 22 22:17:38 compute-0 kernel: tap1b7cb047-70: entered promiscuous mode
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.092 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.093 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b7cb047-70, col_values=(('external_ids', {'iface-id': 'd3f99e89-12b2-4c5f-a047-a3d3247ffb04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.095 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:38 compute-0 ovn_controller[94850]: 2026-01-22T22:17:38Z|00058|binding|INFO|Releasing lport d3f99e89-12b2-4c5f-a047-a3d3247ffb04 from this chassis (sb_readonly=0)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.122 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.123 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1e30bb2b-1ed7-4529-b849-3ae854e9d9ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.124 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-1b7cb047-7415-4b9a-be62-075d33a42dfe
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 1b7cb047-7415-4b9a-be62-075d33a42dfe
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:17:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:38.125 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'env', 'PROCESS_TAG=haproxy-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b7cb047-7415-4b9a-be62-075d33a42dfe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.318 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120258.3173072, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.318 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Started (Lifecycle Event)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.321 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.339 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.341 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.346 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.350 182729 INFO nova.virt.libvirt.driver [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance spawned successfully.
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.351 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.375 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.375 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120258.3176181, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.376 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Paused (Lifecycle Event)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.389 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.390 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.391 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.392 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.393 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.394 182729 DEBUG nova.virt.libvirt.driver [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.404 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.409 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120258.3253155, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.410 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Resumed (Lifecycle Event)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.440 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.448 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.473 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.492 182729 INFO nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Took 4.06 seconds to spawn the instance on the hypervisor.
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.493 182729 DEBUG nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.592 182729 INFO nova.compute.manager [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Took 4.66 seconds to build instance.
Jan 22 22:17:38 compute-0 podman[212926]: 2026-01-22 22:17:38.593804321 +0000 UTC m=+0.079653905 container create 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.610 182729 DEBUG oslo_concurrency.lockutils [None req-8d6e302a-8b0e-4b6d-ba10-6cb30a5bf789 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:38 compute-0 podman[212926]: 2026-01-22 22:17:38.547586378 +0000 UTC m=+0.033435992 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:17:38 compute-0 systemd[1]: Started libpod-conmon-1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e.scope.
Jan 22 22:17:38 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:17:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13dce64df0d3357b7e4f06958e8b03956ec78a96390f672daecbaddf61491129/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.919 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120243.9185388, 52297cc9-cee5-40f3-a5db-e330cb26b900 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.920 182729 INFO nova.compute.manager [-] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] VM Stopped (Lifecycle Event)
Jan 22 22:17:38 compute-0 podman[212926]: 2026-01-22 22:17:38.931292788 +0000 UTC m=+0.417142462 container init 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:17:38 compute-0 podman[212939]: 2026-01-22 22:17:38.939737931 +0000 UTC m=+0.307998217 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:17:38 compute-0 podman[212926]: 2026-01-22 22:17:38.941766912 +0000 UTC m=+0.427616526 container start 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:17:38 compute-0 nova_compute[182725]: 2026-01-22 22:17:38.943 182729 DEBUG nova.compute.manager [None req-05dcfce3-2dac-4051-bbb2-835c4474e0a1 - - - - - -] [instance: 52297cc9-cee5-40f3-a5db-e330cb26b900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:38 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [NOTICE]   (212970) : New worker (212972) forked
Jan 22 22:17:38 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [NOTICE]   (212970) : Loading success.
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.132 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.134 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.155 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b8f4a8-605b-4c47-a288-ce69149fc411]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.192 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3b76ca70-860c-4e41-b8dc-0bc21982d0ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.194 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dbb194-3390-4c8a-8b07-fe6ee4046f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.226 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7f856e-b34a-4ac9-ac99-52499f5435be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.254 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b04d35-a84f-4ae9-a3c2-706f06f782d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387264, 'reachable_time': 35108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212986, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.280 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc3e27b-9da7-41d9-a1d1-d740b6e69bda]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0265f228-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387281, 'tstamp': 387281}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212988, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0265f228-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387286, 'tstamp': 387286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212988, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.282 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:39 compute-0 nova_compute[182725]: 2026-01-22 22:17:39.284 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:39 compute-0 nova_compute[182725]: 2026-01-22 22:17:39.286 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.287 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.287 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.288 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:39.288 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.726 182729 DEBUG nova.compute.manager [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.727 182729 DEBUG oslo_concurrency.lockutils [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.728 182729 DEBUG oslo_concurrency.lockutils [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.728 182729 DEBUG oslo_concurrency.lockutils [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.728 182729 DEBUG nova.compute.manager [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:40 compute-0 nova_compute[182725]: 2026-01-22 22:17:40.728 182729 WARNING nova.compute.manager [req-a528c585-80ad-43d4-9d66-2a09e8733049 req-bc6ceb9e-b19e-4845-891a-df7e4dac28ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state None.
Jan 22 22:17:41 compute-0 nova_compute[182725]: 2026-01-22 22:17:41.958 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:43 compute-0 nova_compute[182725]: 2026-01-22 22:17:43.477 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:46 compute-0 nova_compute[182725]: 2026-01-22 22:17:46.161 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Check if temp file /var/lib/nova/instances/tmphpugygql exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 22 22:17:46 compute-0 nova_compute[182725]: 2026-01-22 22:17:46.161 182729 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 22 22:17:46 compute-0 nova_compute[182725]: 2026-01-22 22:17:46.964 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:48 compute-0 nova_compute[182725]: 2026-01-22 22:17:48.481 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:49 compute-0 nova_compute[182725]: 2026-01-22 22:17:49.069 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:49 compute-0 nova_compute[182725]: 2026-01-22 22:17:49.135 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:49 compute-0 nova_compute[182725]: 2026-01-22 22:17:49.136 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:17:49 compute-0 nova_compute[182725]: 2026-01-22 22:17:49.198 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:17:50 compute-0 podman[213009]: 2026-01-22 22:17:50.136159497 +0000 UTC m=+0.074272219 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:17:51 compute-0 ovn_controller[94850]: 2026-01-22T22:17:51Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:be:3d 10.100.0.6
Jan 22 22:17:51 compute-0 ovn_controller[94850]: 2026-01-22T22:17:51Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:be:3d 10.100.0.6
Jan 22 22:17:51 compute-0 nova_compute[182725]: 2026-01-22 22:17:51.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:52 compute-0 sshd-session[213028]: Accepted publickey for nova from 192.168.122.102 port 48324 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:17:52 compute-0 systemd-logind[801]: New session 28 of user nova.
Jan 22 22:17:52 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:17:52 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:17:52 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:17:52 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:17:52 compute-0 systemd[213032]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:17:52 compute-0 systemd[213032]: Queued start job for default target Main User Target.
Jan 22 22:17:52 compute-0 systemd[213032]: Created slice User Application Slice.
Jan 22 22:17:52 compute-0 systemd[213032]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:17:52 compute-0 systemd[213032]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:17:52 compute-0 systemd[213032]: Reached target Paths.
Jan 22 22:17:52 compute-0 systemd[213032]: Reached target Timers.
Jan 22 22:17:52 compute-0 systemd[213032]: Starting D-Bus User Message Bus Socket...
Jan 22 22:17:52 compute-0 systemd[213032]: Starting Create User's Volatile Files and Directories...
Jan 22 22:17:52 compute-0 systemd[213032]: Finished Create User's Volatile Files and Directories.
Jan 22 22:17:52 compute-0 systemd[213032]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:17:52 compute-0 systemd[213032]: Reached target Sockets.
Jan 22 22:17:52 compute-0 systemd[213032]: Reached target Basic System.
Jan 22 22:17:52 compute-0 systemd[213032]: Reached target Main User Target.
Jan 22 22:17:52 compute-0 systemd[213032]: Startup finished in 163ms.
Jan 22 22:17:52 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:17:52 compute-0 systemd[1]: Started Session 28 of User nova.
Jan 22 22:17:52 compute-0 sshd-session[213028]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:17:52 compute-0 sshd-session[213047]: Received disconnect from 192.168.122.102 port 48324:11: disconnected by user
Jan 22 22:17:52 compute-0 sshd-session[213047]: Disconnected from user nova 192.168.122.102 port 48324
Jan 22 22:17:52 compute-0 sshd-session[213028]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:17:52 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 22 22:17:52 compute-0 systemd-logind[801]: Session 28 logged out. Waiting for processes to exit.
Jan 22 22:17:52 compute-0 systemd-logind[801]: Removed session 28.
Jan 22 22:17:53 compute-0 nova_compute[182725]: 2026-01-22 22:17:53.483 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:54 compute-0 podman[213050]: 2026-01-22 22:17:54.143153438 +0000 UTC m=+0.067094488 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.142 182729 DEBUG nova.compute.manager [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.142 182729 DEBUG oslo_concurrency.lockutils [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.143 182729 DEBUG oslo_concurrency.lockutils [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.143 182729 DEBUG oslo_concurrency.lockutils [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.143 182729 DEBUG nova.compute.manager [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.143 182729 DEBUG nova.compute.manager [req-439a50c4-9d44-43f2-ba47-3570bdd927c4 req-5754ffe5-ce12-4d92-88c0-a29e9bbc1f4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:17:54 compute-0 podman[213049]: 2026-01-22 22:17:54.199972707 +0000 UTC m=+0.123838605 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.706 182729 INFO nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Took 5.51 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.707 182729 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.722 182729 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(3f8b1047-5c0f-43aa-8c73-715bbf081990),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.755 182729 DEBUG nova.objects.instance [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.757 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.759 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.759 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.778 182729 DEBUG nova.virt.libvirt.vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:17:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:17:38Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.779 182729 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.780 182729 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.781 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 22:17:54 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:83:be:3d"/>
Jan 22 22:17:54 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:17:54 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:17:54 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:17:54 compute-0 nova_compute[182725]:   <target dev="tap1d0bf445-f7"/>
Jan 22 22:17:54 compute-0 nova_compute[182725]: </interface>
Jan 22 22:17:54 compute-0 nova_compute[182725]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 22 22:17:54 compute-0 nova_compute[182725]: 2026-01-22 22:17:54.782 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 22 22:17:55 compute-0 nova_compute[182725]: 2026-01-22 22:17:55.262 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:17:55 compute-0 nova_compute[182725]: 2026-01-22 22:17:55.264 182729 INFO nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 22 22:17:55 compute-0 nova_compute[182725]: 2026-01-22 22:17:55.355 182729 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 22 22:17:55 compute-0 nova_compute[182725]: 2026-01-22 22:17:55.858 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:17:55 compute-0 nova_compute[182725]: 2026-01-22 22:17:55.859 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.363 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.365 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.414 182729 DEBUG nova.compute.manager [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.415 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.415 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.415 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.415 182729 DEBUG nova.compute.manager [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.416 182729 WARNING nova.compute.manager [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state migrating.
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.416 182729 DEBUG nova.compute.manager [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-changed-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.416 182729 DEBUG nova.compute.manager [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Refreshing instance network info cache due to event network-changed-1d0bf445-f745-430d-9927-a3d8cdc9b6fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.416 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.417 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.417 182729 DEBUG nova.network.neutron [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Refreshing network info cache for port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.693 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120276.692897, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.694 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Paused (Lifecycle Event)
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.713 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.718 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.747 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.869 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.870 182729 DEBUG nova.virt.libvirt.migration [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:17:56 compute-0 kernel: tap1d0bf445-f7 (unregistering): left promiscuous mode
Jan 22 22:17:56 compute-0 NetworkManager[54954]: <info>  [1769120276.9149] device (tap1d0bf445-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.923 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00059|binding|INFO|Releasing lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc from this chassis (sb_readonly=0)
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00060|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc down in Southbound
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00061|binding|INFO|Releasing lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d from this chassis (sb_readonly=0)
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00062|binding|INFO|Setting lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d down in Southbound
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00063|binding|INFO|Removing iface tap1d0bf445-f7 ovn-installed in OVS
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.926 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00064|binding|INFO|Releasing lport d3f99e89-12b2-4c5f-a047-a3d3247ffb04 from this chassis (sb_readonly=0)
Jan 22 22:17:56 compute-0 ovn_controller[94850]: 2026-01-22T22:17:56Z|00065|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.967 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:78:60 19.80.0.76'], port_security=['fa:16:3e:1b:78:60 19.80.0.76'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['1d0bf445-f745-430d-9927-a3d8cdc9b6fc'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2040814554', 'neutron:cidrs': '19.80.0.76/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2040814554', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52c62c6f-61b3-4b60-8745-b12d4e251f43, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9927fe61-75e1-4c06-8f4c-ccc8597a433d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.970 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:be:3d 10.100.0.6'], port_security=['fa:16:3e:83:be:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e130c2ec-fef7-4ed2-892d-1e3d7eaab401'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-835502342', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb864a01-1633-42f3-ac5f-4d664cc5d477', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-835502342', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=1d0bf445-f745-430d-9927-a3d8cdc9b6fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:17:56 compute-0 nova_compute[182725]: 2026-01-22 22:17:56.971 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.971 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9927fe61-75e1-4c06-8f4c-ccc8597a433d in datapath 1b7cb047-7415-4b9a-be62-075d33a42dfe unbound from our chassis
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.973 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b7cb047-7415-4b9a-be62-075d33a42dfe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.975 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fde33340-64d3-43f8-9b97-0638f0472f47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:56.976 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe namespace which is not needed anymore
Jan 22 22:17:57 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 22 22:17:57 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000e.scope: Consumed 13.429s CPU time.
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.004 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:57 compute-0 systemd-machined[154006]: Machine qemu-6-instance-0000000e terminated.
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.163 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.164 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.164 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 22:17:57 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [NOTICE]   (212970) : haproxy version is 2.8.14-c23fe91
Jan 22 22:17:57 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [NOTICE]   (212970) : path to executable is /usr/sbin/haproxy
Jan 22 22:17:57 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [WARNING]  (212970) : Exiting Master process...
Jan 22 22:17:57 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [ALERT]    (212970) : Current worker (212972) exited with code 143 (Terminated)
Jan 22 22:17:57 compute-0 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[212951]: [WARNING]  (212970) : All workers exited. Exiting... (0)
Jan 22 22:17:57 compute-0 systemd[1]: libpod-1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e.scope: Deactivated successfully.
Jan 22 22:17:57 compute-0 podman[213122]: 2026-01-22 22:17:57.190998788 +0000 UTC m=+0.102066678 container died 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:17:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e-userdata-shm.mount: Deactivated successfully.
Jan 22 22:17:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-13dce64df0d3357b7e4f06958e8b03956ec78a96390f672daecbaddf61491129-merged.mount: Deactivated successfully.
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.373 182729 DEBUG nova.virt.libvirt.guest [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'eb864a01-1633-42f3-ac5f-4d664cc5d477' (instance-0000000e) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.374 182729 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migration operation has completed
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.375 182729 INFO nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] _post_live_migration() is started..
Jan 22 22:17:57 compute-0 podman[213122]: 2026-01-22 22:17:57.423896265 +0000 UTC m=+0.334964195 container cleanup 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:17:57 compute-0 systemd[1]: libpod-conmon-1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e.scope: Deactivated successfully.
Jan 22 22:17:57 compute-0 podman[213169]: 2026-01-22 22:17:57.68890054 +0000 UTC m=+0.234270673 container remove 1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.698 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad19558-1a0b-4ddd-b34b-5687e3a2b327]: (4, ('Thu Jan 22 10:17:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe (1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e)\n1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e\nThu Jan 22 10:17:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe (1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e)\n1d3c62f224b2225bd8dc98b827b3d572f5e7feb11524c88067173046f279171e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.700 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[80be83bd-833a-4664-afa3-33b0e2f212ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.701 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b7cb047-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.703 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:57 compute-0 kernel: tap1b7cb047-70: left promiscuous mode
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.731 182729 DEBUG nova.compute.manager [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.732 182729 DEBUG oslo_concurrency.lockutils [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.732 182729 DEBUG oslo_concurrency.lockutils [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.733 182729 DEBUG oslo_concurrency.lockutils [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.733 182729 DEBUG nova.compute.manager [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.734 182729 DEBUG nova.compute.manager [req-01342b40-4d6a-4eb1-ae7a-60afe866121e req-f66e1c1d-1fab-457c-bab2-e7b186a690a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.738 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[90807022-355d-45f7-8deb-492df02e07e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.753 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6f312f13-b967-446f-ade9-d4f05ba1ae61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.755 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2954dbd9-1b7a-4b37-8805-86012d0d9d06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.774 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f03d6007-bc5e-4cbb-9e2d-5fbea5d0a846]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389537, 'reachable_time': 23316, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213188, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d1b7cb047\x2d7415\x2d4b9a\x2dbe62\x2d075d33a42dfe.mount: Deactivated successfully.
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.778 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.780 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f413b24d-cf07-4222-ae63-2297eae590a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.781 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.783 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.807 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f84e82-ffa4-4089-8509-7dc198c29b87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.849 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d92e3d6d-80bf-439e-aa91-5c30459602a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.853 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b6296e00-6705-465e-8c2f-0f4615a52bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.895 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[302fa733-32de-4d47-a8d0-74f604e4d1f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.921 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d247b0fa-8431-4c89-a84f-31473d42ae49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 22, 'tx_packets': 7, 'rx_bytes': 1204, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 22, 'tx_packets': 7, 'rx_bytes': 1204, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387264, 'reachable_time': 35108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213194, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.948 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6917d596-4b49-4dd4-9c7d-e728119458e3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0265f228-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387281, 'tstamp': 387281}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213195, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0265f228-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387286, 'tstamp': 387286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213195, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.950 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:57 compute-0 nova_compute[182725]: 2026-01-22 22:17:57.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.959 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.959 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.960 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:17:57.960 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.588 182729 DEBUG nova.compute.manager [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.589 182729 DEBUG oslo_concurrency.lockutils [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.589 182729 DEBUG oslo_concurrency.lockutils [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.589 182729 DEBUG oslo_concurrency.lockutils [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.590 182729 DEBUG nova.compute.manager [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.590 182729 DEBUG nova.compute.manager [req-2174cf0f-003a-49bf-bf27-0d77e3ac5608 req-93da917d-880f-4e15-8dbe-71e89b7adf36 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.592 182729 DEBUG nova.network.neutron [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updated VIF entry in instance network info cache for port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.593 182729 DEBUG nova.network.neutron [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:17:58 compute-0 nova_compute[182725]: 2026-01-22 22:17:58.618 182729 DEBUG oslo_concurrency.lockutils [req-9aa310bd-751c-4ac6-aa8d-db06d1ebd487 req-682b196f-b940-442e-9097-6cb6c89feaa4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.104 182729 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Activated binding for port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.105 182729 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.106 182729 DEBUG nova.virt.libvirt.vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:17:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:17:45Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.107 182729 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.108 182729 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.109 182729 DEBUG os_vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.111 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.112 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d0bf445-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.118 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.122 182729 INFO os_vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7')
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.123 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.124 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.124 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.125 182729 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.125 182729 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Deleting instance files /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477_del
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.126 182729 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Deletion of /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477_del complete
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.886 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.886 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.887 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.887 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.887 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.887 182729 WARNING nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state migrating.
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.888 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.888 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.888 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.889 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.889 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.889 182729 WARNING nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state migrating.
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.889 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.890 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.890 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.890 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.891 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.891 182729 WARNING nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state migrating.
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.891 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.891 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.892 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.892 182729 DEBUG oslo_concurrency.lockutils [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.892 182729 DEBUG nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:17:59 compute-0 nova_compute[182725]: 2026-01-22 22:17:59.892 182729 WARNING nova.compute.manager [req-da7623d9-41fc-4b14-bb3c-e52aa6413155 req-a746319c-1986-45f8-82ac-56bcd0cf0532 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state migrating.
Jan 22 22:18:02 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:18:02 compute-0 systemd[213032]: Activating special unit Exit the Session...
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped target Main User Target.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped target Basic System.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped target Paths.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped target Sockets.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped target Timers.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:18:02 compute-0 systemd[213032]: Closed D-Bus User Message Bus Socket.
Jan 22 22:18:02 compute-0 systemd[213032]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:18:02 compute-0 systemd[213032]: Removed slice User Application Slice.
Jan 22 22:18:02 compute-0 systemd[213032]: Reached target Shutdown.
Jan 22 22:18:02 compute-0 systemd[213032]: Finished Exit the Session.
Jan 22 22:18:02 compute-0 systemd[213032]: Reached target Exit the Session.
Jan 22 22:18:02 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:18:02 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:18:02 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:18:02 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:18:02 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:18:02 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:18:02 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:18:02 compute-0 nova_compute[182725]: 2026-01-22 22:18:02.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:02 compute-0 nova_compute[182725]: 2026-01-22 22:18:02.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:18:02 compute-0 nova_compute[182725]: 2026-01-22 22:18:02.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:18:03 compute-0 nova_compute[182725]: 2026-01-22 22:18:03.398 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:18:03 compute-0 nova_compute[182725]: 2026-01-22 22:18:03.399 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:18:03 compute-0 nova_compute[182725]: 2026-01-22 22:18:03.399 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:18:03 compute-0 nova_compute[182725]: 2026-01-22 22:18:03.399 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:03 compute-0 nova_compute[182725]: 2026-01-22 22:18:03.489 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.115 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:04 compute-0 podman[213200]: 2026-01-22 22:18:04.159420485 +0000 UTC m=+0.078668289 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:18:04 compute-0 podman[213199]: 2026-01-22 22:18:04.191867241 +0000 UTC m=+0.116842449 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.642 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.643 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.643 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.672 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.674 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.674 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.675 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.744 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.837 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.838 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:04 compute-0 nova_compute[182725]: 2026-01-22 22:18:04.909 182729 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.119 182729 WARNING nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.121 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5590MB free_disk=73.3523063659668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.121 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.121 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.123 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.148 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.148 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.149 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.171 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration for instance eb864a01-1633-42f3-ac5f-4d664cc5d477 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.192 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.219 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.219 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration 3f8b1047-5c0f-43aa-8c73-715bbf081990 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.220 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.220 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.288 182729 DEBUG nova.compute.provider_tree [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.303 182729 DEBUG nova.scheduler.client.report [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.337 182729 DEBUG nova.compute.resource_tracker [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.338 182729 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.352 182729 INFO nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.504 182729 INFO nova.scheduler.client.report [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Deleted allocation for migration 3f8b1047-5c0f-43aa-8c73-715bbf081990
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.505 182729 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.914 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.914 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:05 compute-0 nova_compute[182725]: 2026-01-22 22:18:05.915 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.013 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.087 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.088 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.183 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.419 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.421 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5588MB free_disk=73.3523063659668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.421 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.421 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.478 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.479 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.479 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.522 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.536 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.538 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:18:06 compute-0 nova_compute[182725]: 2026-01-22 22:18:06.538 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:07 compute-0 nova_compute[182725]: 2026-01-22 22:18:07.539 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:07 compute-0 nova_compute[182725]: 2026-01-22 22:18:07.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:07 compute-0 nova_compute[182725]: 2026-01-22 22:18:07.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:18:08 compute-0 nova_compute[182725]: 2026-01-22 22:18:08.491 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:08 compute-0 nova_compute[182725]: 2026-01-22 22:18:08.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:08 compute-0 nova_compute[182725]: 2026-01-22 22:18:08.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:09 compute-0 nova_compute[182725]: 2026-01-22 22:18:09.117 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:09 compute-0 podman[213255]: 2026-01-22 22:18:09.12289301 +0000 UTC m=+0.060577235 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.460 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}5d185c1c3e6f8459a59d0666531579b8e37af0b60f5d8d565a43e1812b325b82" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.540 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 22 Jan 2026 22:18:09 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b91949c1-4f40-4d60-a639-7ceb915fa870 x-openstack-request-id: req-b91949c1-4f40-4d60-a639-7ceb915fa870 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.540 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "617fb2f8-2c15-4939-a64a-90fca4acd12a", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/617fb2f8-2c15-4939-a64a-90fca4acd12a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/617fb2f8-2c15-4939-a64a-90fca4acd12a"}]}, {"id": "63b0d901-60c2-48cb-afeb-72a71e897d3d", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.540 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-b91949c1-4f40-4d60-a639-7ceb915fa870 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.542 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}5d185c1c3e6f8459a59d0666531579b8e37af0b60f5d8d565a43e1812b325b82" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.609 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 22 Jan 2026 22:18:09 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ebe9b8c0-8636-4fbd-b545-1ac11f848f02 x-openstack-request-id: req-ebe9b8c0-8636-4fbd-b545-1ac11f848f02 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.609 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "63b0d901-60c2-48cb-afeb-72a71e897d3d", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.609 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/63b0d901-60c2-48cb-afeb-72a71e897d3d used request id req-ebe9b8c0-8636-4fbd-b545-1ac11f848f02 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.611 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4ff5f7f17f1c471986dfd67f5192359f', 'user_id': 'f591d36af603475bbc613d6c93854a42', 'hostId': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.611 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.634 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/cpu volume: 360000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cd0f452-df4b-4dd5-a17d-b24e9ea1c3ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 360000000, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'timestamp': '2026-01-22T22:18:09.611462', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3bcaa4da-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.291825158, 'message_signature': 'fbbe4b1b93d63c8d9ebf0d27e2d6236f992614ef7743e2df645b2305684bc52f'}]}, 'timestamp': '2026-01-22 22:18:09.635992', '_unique_id': 'd765f52218bd478a9ea274a523b89c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.640 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.643 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.682 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.bytes volume: 126976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.682 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '760f8e1f-995d-4353-ab4c-a199e68265ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 126976, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.643509', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd1ccc4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '398306381d485b5f907a89c00ac58647d5d7f8b2fbbf368cd6c50dedf518143c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 
'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.643509', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd1dbf6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '3c649bcf9895350b789ba43e5923e53f8d2992e1b829031dce8852b383342363'}]}, 'timestamp': '2026-01-22 22:18:09.683096', '_unique_id': '5897a7e994be44ac905019a1500d1a73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.684 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.686 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.690 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1c2458ea-22d6-480f-ae75-5f050eb08b2b / tap3cbb0272-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.690 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3df736f1-f67e-4cdd-a2ca-61787d10261f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.686208', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd320c4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': 'a67d6df9799687ad722201e7d18a561991dfaabeaaae04d8d3d08e5d77cf0bd3'}]}, 'timestamp': '2026-01-22 22:18:09.691728', '_unique_id': '8db6ea28de1b4579bc8e67cc39cff9cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.693 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.706 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.707 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b292e798-62a0-47ed-b26e-3d5409f3d42f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.694912', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd586de-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': 'd800e7b8eb83e02ac9a0d680023634ecbde06c57a9f3c4544625fe5498d1e8f6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.694912', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd59ff2-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': '2815d374e081beb3e551b1a08de81be6ff1e3370322e7974c8c7278e90ca26a8'}]}, 'timestamp': '2026-01-22 22:18:09.707986', '_unique_id': 'a9965eea5f274cf68f09f5621d056ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.709 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.711 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.incoming.bytes volume: 1714 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebd5ea25-539f-4937-b603-62590ea5e5d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1714, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.711112', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd64b82-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': 'd93d0e48bb91dee12d74d45733bc80f52fb5afa7ff353b32ba5599af1f7292ad'}]}, 'timestamp': '2026-01-22 22:18:09.712342', '_unique_id': '2e20322c13184afaaff68f32f9b5dac3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.713 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.714 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.requests volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.715 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a7319f1-20fd-4d0e-bb74-ed133c1183be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 19, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.714664', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd6bb26-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '019368fa9b7015deae7b3e78744b2f9e06c6657c76e0da206a983085a4012a2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.714664', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd6c83c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '0fae3a5bf786ab65bb7a7b4858728a0bbe9bece03450f46a81f84c4c139e68ed'}]}, 'timestamp': '2026-01-22 22:18:09.715355', '_unique_id': '7d1d2fa10750430f852300295f6a9bd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.716 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.717 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dce56ab9-a49a-484d-bab0-2f5641524c07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.717128', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd71ad0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': '0def2bb1de9c234554b5afc0e1fd7466dfd37606c763f54cc2d8e8e0348242eb'}]}, 'timestamp': '2026-01-22 22:18:09.717493', '_unique_id': 'f7728d32865d496e9e3250011370bcbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.718 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.719 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2071b1c-cae6-4d6c-876c-539aa5120c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.719299', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd7703e-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': '1117c75d161372257119e083bdbfd49870d2769c0cf3ad2ffed8ea296c1c45e6'}]}, 'timestamp': '2026-01-22 22:18:09.719671', '_unique_id': 'e26af12df40840f2a33b6ec3de52689f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.720 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.721 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.721 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>]
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.722 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.722 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43bf5a1c-9f8f-4248-b902-4b1c95736118', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.722023', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd7d970-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '72e02326ac4cd1e07f375b8db79fe364b48815e46f4702684b4f1c1fe8f1d8d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.722023', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd7e5d2-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': 'e9a741443651116d2e891ab3f1ea4c238cd74b1ef76e139506177f46cf29f3a6'}]}, 'timestamp': '2026-01-22 22:18:09.722673', '_unique_id': 'a01ae5b7cc8a4e4da124b6a5ee2a1e79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.723 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.724 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.724 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>]
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.724 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.724 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/memory.usage volume: 42.68359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f49000d-e62a-460c-bf66-54fcb0090d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.68359375, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'timestamp': '2026-01-22T22:18:09.724691', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3bd841bc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.291825158, 'message_signature': '0b2d774dcc7906e9f08be1e5c67ae8d7dcf9647290d70a5d645626463f932a83'}]}, 'timestamp': '2026-01-22 22:18:09.725006', '_unique_id': '90e61ba1921b4e679234a4e1206a9070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.725 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.726 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08c7992f-1710-4a19-932c-1ccb6f9b090f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.726547', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd889e2-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': '3e45cb339bbab394e569998516794f1ee7382909bf4346657a472ed48ce4e301'}]}, 'timestamp': '2026-01-22 22:18:09.726894', '_unique_id': '49ecd5125416485ba2e47460ec695018'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.727 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.728 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.728 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.728 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48fd92b2-5b02-4a49-ac66-afae7492734f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.728346', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd8cfba-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': '6dd65517c78e6748820f8ce5f129d42e2416154834c9522693eeb0ccd92ce554'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.728346', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd8da46-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': 'a47a97b80827f81f7861deb6e0571cf9b9ea19d9d880db6c5c9536ec4e472f3f'}]}, 'timestamp': '2026-01-22 22:18:09.728927', '_unique_id': '00c4dc8182cf470a989791a96a297c8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.729 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.730 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7df96937-b071-4388-975c-d34f89ff0738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.730499', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bd92442-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': 'c2362b37edae429adc583e5dd7136b05414c566ca58418d1622cd5d16af45969'}]}, 'timestamp': '2026-01-22 22:18:09.730846', '_unique_id': '033e84e5ea264b1d8087756e12e91ef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.731 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.732 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.732 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cb10388-c0e1-461b-9c20-6e6a5aad45a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.732315', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd96a38-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '9a4f50dd2705c3cfc06b920a05b61648127c01eeb0b29c5dcdf2121cb0106e09'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.732315', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd973c0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '309acd5b46929b2e2d9b542911494657c2c8d89a038132a9876e0998ba5b97ba'}]}, 'timestamp': '2026-01-22 22:18:09.732852', '_unique_id': 'c0bcd574fe9e449c99710015313b5f27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.733 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.734 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.734 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '876d88b1-632b-4b07-805a-2337562764d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.734317', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bd9b970-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': '69be2bfe691787df80ad5a61271b541a34e10dddae4fc746d70e1e718ed4bcbb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.734317', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bd9c492-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.352670588, 'message_signature': '35037ffdbd144128174bb55adbfaba1c8e144e02aaba7fafe3ddb6b51a90dabd'}]}, 'timestamp': '2026-01-22 22:18:09.734932', '_unique_id': '1c9e066a844f44dd93e33cc871f80801'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.735 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.736 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87aa064a-5ea3-4ba8-9d42-5c3a36bef021', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.736646', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bda14c4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': 'd86e0e2a76482d1c4f859b24c4e8c0ba27b24c7ff9169add455515e8fe1663b3'}]}, 'timestamp': '2026-01-22 22:18:09.736974', '_unique_id': '96e1c23de53846d1b4343c6d4a2697a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.737 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.738 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a7f5e7-5162-422b-87ef-b5c8d6d9c565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.738416', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bda5902-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': 'b54e2b8cd8ec84e0ef3d4ce882010503b57fb2aa03f2532f149e8c59e7bdecbd'}]}, 'timestamp': '2026-01-22 22:18:09.738720', '_unique_id': '3a7705abf9cc44e4889cb717fa9ba947'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.739 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.740 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.740 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>]
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.740 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.741 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-489483157>]
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.741 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.741 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.latency volume: 16973273 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.741 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5abb96da-4e29-47ff-8898-2747ef22b8d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16973273, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.741280', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bdacaa4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': 'e74b26d42731f3d3b4de2c2d98978ce204f71944299cc9ab0f7b1eb05e0e651d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.741280', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bdadfda-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '64bd9d2d0533fec3c98f0578f1ea86a626ed1b9388379a8a93ff42566c2aa6ee'}]}, 'timestamp': '2026-01-22 22:18:09.742169', '_unique_id': 'ac5bbd354cad45e080c7f7dbf68e82dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.744 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.incoming.packets volume: 33 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd6dd1ed-f689-402b-9915-17c555250ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 33, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.744064', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bdb36ce-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': '6381e4d061918baedd838f21d70f5a17520654c76e6c99d469ccb9cf2161b560'}]}, 'timestamp': '2026-01-22 22:18:09.744444', '_unique_id': '28fd1a3d242343e988b4af80cbc53063'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.746 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '206e5cd9-7cf9-46a1-9e08-e2e7d905e94b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': 'instance-00000007-1c2458ea-22d6-480f-ae75-5f050eb08b2b-tap3cbb0272-18', 'timestamp': '2026-01-22T22:18:09.746164', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'tap3cbb0272-18', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:6c:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3cbb0272-18'}, 'message_id': '3bdb893a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.343945208, 'message_signature': '1903cf358c02809ebb8f3787e153ef4dcbafebe596fe135f2065f4c5d200fdba'}]}, 'timestamp': '2026-01-22 22:18:09.746527', '_unique_id': 'a9f44044a6c34b24bcaa88d3bde6328e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.747 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.748 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.748 12 DEBUG ceilometer.compute.pollsters [-] 1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4834b5d1-c691-4819-85b1-54b0a19f6b69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-vda', 'timestamp': '2026-01-22T22:18:09.748233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3bdbd9a8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': 'ec553fd5a16701e2fe761adf760092d6a7d774146ef0e4521b2b2f178911d6a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f591d36af603475bbc613d6c93854a42', 'user_name': None, 'project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'project_name': None, 'resource_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b-sda', 'timestamp': '2026-01-22T22:18:09.748233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-489483157', 'name': 'instance-00000007', 'instance_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'instance_type': 'm1.nano', 'host': '7b606a46b3baf2c7aa065f68318c5d5241326fb145d24c4b12d806b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3bdbe696-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 3927.300899856, 'message_signature': '2f67f95d499ecf980bbb3bc72206d710e1016cc60060a69f181832e962a099b0'}]}, 'timestamp': '2026-01-22 22:18:09.748934', '_unique_id': '39cb6f4464ac4742818938de47e656bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:18:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:18:09.749 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:18:09 compute-0 nova_compute[182725]: 2026-01-22 22:18:09.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:18:12 compute-0 nova_compute[182725]: 2026-01-22 22:18:12.162 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120277.1609573, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:12 compute-0 nova_compute[182725]: 2026-01-22 22:18:12.163 182729 INFO nova.compute.manager [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Stopped (Lifecycle Event)
Jan 22 22:18:12 compute-0 nova_compute[182725]: 2026-01-22 22:18:12.187 182729 DEBUG nova.compute.manager [None req-532d6fd0-cff2-4190-a710-9353cb996587 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:12.425 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:12.426 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:12.426 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.229 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.230 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.230 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.230 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.231 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.244 182729 INFO nova.compute.manager [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Terminating instance
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.260 182729 DEBUG nova.compute.manager [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:18:13 compute-0 kernel: tap3cbb0272-18 (unregistering): left promiscuous mode
Jan 22 22:18:13 compute-0 NetworkManager[54954]: <info>  [1769120293.2875] device (tap3cbb0272-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:18:13 compute-0 ovn_controller[94850]: 2026-01-22T22:18:13Z|00066|binding|INFO|Releasing lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 from this chassis (sb_readonly=0)
Jan 22 22:18:13 compute-0 ovn_controller[94850]: 2026-01-22T22:18:13Z|00067|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 down in Southbound
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.293 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 ovn_controller[94850]: 2026-01-22T22:18:13Z|00068|binding|INFO|Removing iface tap3cbb0272-18 ovn-installed in OVS
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.298 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.309 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.311 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.310 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.312 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0265f228-4e11-4f15-8d77-6acb409f3f7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.313 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[faac8baa-0806-4ca2-8ae9-d49b8a02e4fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.314 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace which is not needed anymore
Jan 22 22:18:13 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 22 22:18:13 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 4.293s CPU time.
Jan 22 22:18:13 compute-0 systemd-machined[154006]: Machine qemu-4-instance-00000007 terminated.
Jan 22 22:18:13 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [NOTICE]   (212640) : haproxy version is 2.8.14-c23fe91
Jan 22 22:18:13 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [NOTICE]   (212640) : path to executable is /usr/sbin/haproxy
Jan 22 22:18:13 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [WARNING]  (212640) : Exiting Master process...
Jan 22 22:18:13 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [ALERT]    (212640) : Current worker (212642) exited with code 143 (Terminated)
Jan 22 22:18:13 compute-0 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212636]: [WARNING]  (212640) : All workers exited. Exiting... (0)
Jan 22 22:18:13 compute-0 systemd[1]: libpod-bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08.scope: Deactivated successfully.
Jan 22 22:18:13 compute-0 podman[213305]: 2026-01-22 22:18:13.467401439 +0000 UTC m=+0.053288321 container died bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.493 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08-userdata-shm.mount: Deactivated successfully.
Jan 22 22:18:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-529a7993d85d92e3d3d3bca4c8bf362180b6b9c90e484cf8fe467e0d8afda0f4-merged.mount: Deactivated successfully.
Jan 22 22:18:13 compute-0 podman[213305]: 2026-01-22 22:18:13.518961145 +0000 UTC m=+0.104847987 container cleanup bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:18:13 compute-0 systemd[1]: libpod-conmon-bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08.scope: Deactivated successfully.
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.529 182729 DEBUG nova.compute.manager [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.530 182729 DEBUG oslo_concurrency.lockutils [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.531 182729 DEBUG oslo_concurrency.lockutils [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.531 182729 DEBUG oslo_concurrency.lockutils [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.531 182729 DEBUG nova.compute.manager [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.531 182729 DEBUG nova.compute.manager [req-e4ceffb1-60e5-4dfc-ba97-6721cee5ca9e req-7dd41a71-9fa3-41ec-994c-ed56f7d32e70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.533 182729 INFO nova.virt.libvirt.driver [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance destroyed successfully.
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.534 182729 DEBUG nova.objects.instance [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'resources' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.545 182729 DEBUG nova.virt.libvirt.vif [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:17:19Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.545 182729 DEBUG nova.network.os_vif_util [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.546 182729 DEBUG nova.network.os_vif_util [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.546 182729 DEBUG os_vif [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.548 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.548 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cbb0272-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.550 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.551 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.554 182729 INFO os_vif [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.555 182729 INFO nova.virt.libvirt.driver [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deleting instance files /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.555 182729 INFO nova.virt.libvirt.driver [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deletion of /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del complete
Jan 22 22:18:13 compute-0 podman[213351]: 2026-01-22 22:18:13.58912748 +0000 UTC m=+0.044160852 container remove bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.596 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7ac8a9d-e80b-46f1-82b1-7020b5d27e66]: (4, ('Thu Jan 22 10:18:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08)\nbf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08\nThu Jan 22 10:18:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (bf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08)\nbf21015051127f2b8a229a33b47792b7972b2161faba5d561cf4987bd5f71a08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.598 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a2178de5-27ba-47f1-b77f-4233959c2807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.599 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.601 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 kernel: tap0265f228-40: left promiscuous mode
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.617 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.619 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[131c1f34-247b-4cbd-8105-f838a562827a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.628 182729 INFO nova.compute.manager [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.629 182729 DEBUG oslo.service.loopingcall [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.629 182729 DEBUG nova.compute.manager [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:18:13 compute-0 nova_compute[182725]: 2026-01-22 22:18:13.630 182729 DEBUG nova.network.neutron [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.637 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a9abc253-4823-43d6-9108-74fa0262cf33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.638 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3b237bea-9865-46d5-a84c-eaee2865bfce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.661 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5063d9-981c-4784-8eab-db8eac3c8aad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387255, 'reachable_time': 43776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213363, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d0265f228\x2d4e11\x2d4f15\x2d8d77\x2d6acb409f3f7b.mount: Deactivated successfully.
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.667 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:18:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:13.669 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[c37e396d-77a2-47cc-8109-3b2b09ba305f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:14 compute-0 nova_compute[182725]: 2026-01-22 22:18:14.838 182729 DEBUG nova.network.neutron [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:14 compute-0 nova_compute[182725]: 2026-01-22 22:18:14.863 182729 INFO nova.compute.manager [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 1.23 seconds to deallocate network for instance.
Jan 22 22:18:14 compute-0 nova_compute[182725]: 2026-01-22 22:18:14.988 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:14 compute-0 nova_compute[182725]: 2026-01-22 22:18:14.989 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.065 182729 DEBUG nova.compute.provider_tree [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.085 182729 DEBUG nova.scheduler.client.report [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.113 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.130 182729 DEBUG nova.compute.manager [req-225f9609-cfaa-4143-81cf-abdd07e04ad0 req-8e40cf3e-7177-49bb-b029-c7d080ecd8aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-deleted-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.141 182729 INFO nova.scheduler.client.report [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Deleted allocations for instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.220 182729 DEBUG oslo_concurrency.lockutils [None req-f5dab329-9496-4a08-b0ad-3152d3d570a8 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:15.652 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.653 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:15.655 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.676 182729 DEBUG nova.compute.manager [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.677 182729 DEBUG oslo_concurrency.lockutils [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.677 182729 DEBUG oslo_concurrency.lockutils [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.678 182729 DEBUG oslo_concurrency.lockutils [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.678 182729 DEBUG nova.compute.manager [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:18:15 compute-0 nova_compute[182725]: 2026-01-22 22:18:15.679 182729 WARNING nova.compute.manager [req-52842fde-f39d-4939-9312-34c0eeed580b req-fe0d7d6c-9e26-4ff9-b4c8-8228c37c1d19 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state deleted and task_state None.
Jan 22 22:18:18 compute-0 nova_compute[182725]: 2026-01-22 22:18:18.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:18 compute-0 nova_compute[182725]: 2026-01-22 22:18:18.551 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:20 compute-0 nova_compute[182725]: 2026-01-22 22:18:20.179 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:21 compute-0 podman[213364]: 2026-01-22 22:18:21.161585178 +0000 UTC m=+0.082158217 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 22:18:21 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:21.658 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:23 compute-0 nova_compute[182725]: 2026-01-22 22:18:23.505 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:23 compute-0 nova_compute[182725]: 2026-01-22 22:18:23.553 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:25 compute-0 podman[213387]: 2026-01-22 22:18:25.170837036 +0000 UTC m=+0.089894762 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 22 22:18:25 compute-0 podman[213386]: 2026-01-22 22:18:25.191162377 +0000 UTC m=+0.115136757 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:18:28 compute-0 nova_compute[182725]: 2026-01-22 22:18:28.507 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:28 compute-0 nova_compute[182725]: 2026-01-22 22:18:28.526 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120293.5252242, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:28 compute-0 nova_compute[182725]: 2026-01-22 22:18:28.526 182729 INFO nova.compute.manager [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Stopped (Lifecycle Event)
Jan 22 22:18:28 compute-0 nova_compute[182725]: 2026-01-22 22:18:28.554 182729 DEBUG nova.compute.manager [None req-55177b8e-cdd6-4c49-ac32-2df90dc36349 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:28 compute-0 nova_compute[182725]: 2026-01-22 22:18:28.555 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:33 compute-0 nova_compute[182725]: 2026-01-22 22:18:33.510 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:33 compute-0 nova_compute[182725]: 2026-01-22 22:18:33.557 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.822 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.823 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.843 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.974 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.975 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.986 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:18:34 compute-0 nova_compute[182725]: 2026-01-22 22:18:34.986 182729 INFO nova.compute.claims [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.152 182729 DEBUG nova.compute.provider_tree [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:18:35 compute-0 podman[213433]: 2026-01-22 22:18:35.159655794 +0000 UTC m=+0.084576098 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.168 182729 DEBUG nova.scheduler.client.report [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:18:35 compute-0 podman[213434]: 2026-01-22 22:18:35.16863995 +0000 UTC m=+0.092668522 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.194 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.195 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.256 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.257 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.284 182729 INFO nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.303 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.412 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.414 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.415 182729 INFO nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Creating image(s)
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.415 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.416 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.416 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.431 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.529 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.531 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.531 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.542 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.603 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.604 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.698 182729 DEBUG nova.policy [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f23ea0c335b84bd2b78725d5a5491d0a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '214876cdc63543458d35ee214fe21b2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.701 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.703 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.704 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.798 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.800 182729 DEBUG nova.virt.disk.api [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Checking if we can resize image /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.800 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.872 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.873 182729 DEBUG nova.virt.disk.api [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Cannot resize image /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.874 182729 DEBUG nova.objects.instance [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'migration_context' on Instance uuid 008af030-d785-4936-871a-4d52ccebc8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.892 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.893 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Ensure instance console log exists: /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.893 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.894 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:35 compute-0 nova_compute[182725]: 2026-01-22 22:18:35.895 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:36 compute-0 nova_compute[182725]: 2026-01-22 22:18:36.479 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Successfully created port: 09a74418-977b-4aa6-86a6-2d84a3cb143a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.234 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Successfully updated port: 09a74418-977b-4aa6-86a6-2d84a3cb143a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.251 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.252 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquired lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.252 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.332 182729 DEBUG nova.compute.manager [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-changed-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.333 182729 DEBUG nova.compute.manager [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Refreshing instance network info cache due to event network-changed-09a74418-977b-4aa6-86a6-2d84a3cb143a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.333 182729 DEBUG oslo_concurrency.lockutils [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:18:37 compute-0 nova_compute[182725]: 2026-01-22 22:18:37.613 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.512 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.559 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.704 182729 DEBUG nova.network.neutron [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Updating instance_info_cache with network_info: [{"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.733 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Releasing lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.734 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Instance network_info: |[{"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.736 182729 DEBUG oslo_concurrency.lockutils [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.737 182729 DEBUG nova.network.neutron [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Refreshing network info cache for port 09a74418-977b-4aa6-86a6-2d84a3cb143a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.742 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Start _get_guest_xml network_info=[{"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.749 182729 WARNING nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.758 182729 DEBUG nova.virt.libvirt.host [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.759 182729 DEBUG nova.virt.libvirt.host [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.770 182729 DEBUG nova.virt.libvirt.host [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.771 182729 DEBUG nova.virt.libvirt.host [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.773 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.773 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.774 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.774 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.775 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.775 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.776 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.776 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.777 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.777 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.778 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.778 182729 DEBUG nova.virt.hardware [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.784 182729 DEBUG nova.virt.libvirt.vif [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1904910074',display_name='tempest-ServersAdminTestJSON-server-1904910074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1904910074',id=19,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-tyub60qk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-18253
62070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:35Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=008af030-d785-4936-871a-4d52ccebc8f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.785 182729 DEBUG nova.network.os_vif_util [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.787 182729 DEBUG nova.network.os_vif_util [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.788 182729 DEBUG nova.objects.instance [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 008af030-d785-4936-871a-4d52ccebc8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.806 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <uuid>008af030-d785-4936-871a-4d52ccebc8f8</uuid>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <name>instance-00000013</name>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersAdminTestJSON-server-1904910074</nova:name>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:18:38</nova:creationTime>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:user uuid="f23ea0c335b84bd2b78725d5a5491d0a">tempest-ServersAdminTestJSON-1825362070-project-member</nova:user>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:project uuid="214876cdc63543458d35ee214fe21b2c">tempest-ServersAdminTestJSON-1825362070</nova:project>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         <nova:port uuid="09a74418-977b-4aa6-86a6-2d84a3cb143a">
Jan 22 22:18:38 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <system>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="serial">008af030-d785-4936-871a-4d52ccebc8f8</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="uuid">008af030-d785-4936-871a-4d52ccebc8f8</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </system>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <os>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </os>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <features>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </features>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.config"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:59:3d:22"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <target dev="tap09a74418-97"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/console.log" append="off"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <video>
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </video>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:18:38 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:18:38 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:18:38 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:18:38 compute-0 nova_compute[182725]: </domain>
Jan 22 22:18:38 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.808 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Preparing to wait for external event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.809 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.809 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.810 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.811 182729 DEBUG nova.virt.libvirt.vif [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1904910074',display_name='tempest-ServersAdminTestJSON-server-1904910074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1904910074',id=19,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-tyub60qk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTest
JSON-1825362070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:35Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=008af030-d785-4936-871a-4d52ccebc8f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.811 182729 DEBUG nova.network.os_vif_util [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.812 182729 DEBUG nova.network.os_vif_util [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.813 182729 DEBUG os_vif [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.814 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.815 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.815 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.819 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.819 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09a74418-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.820 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09a74418-97, col_values=(('external_ids', {'iface-id': '09a74418-977b-4aa6-86a6-2d84a3cb143a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:3d:22', 'vm-uuid': '008af030-d785-4936-871a-4d52ccebc8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 NetworkManager[54954]: <info>  [1769120318.8250] manager: (tap09a74418-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.830 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.831 182729 INFO os_vif [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97')
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.906 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.908 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.909 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No VIF found with MAC fa:16:3e:59:3d:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:18:38 compute-0 nova_compute[182725]: 2026-01-22 22:18:38.910 182729 INFO nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Using config drive
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.520 182729 INFO nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Creating config drive at /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.config
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.530 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe3bx74zv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.670 182729 DEBUG oslo_concurrency.processutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe3bx74zv" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:39 compute-0 kernel: tap09a74418-97: entered promiscuous mode
Jan 22 22:18:39 compute-0 NetworkManager[54954]: <info>  [1769120319.7846] manager: (tap09a74418-97): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 22 22:18:39 compute-0 ovn_controller[94850]: 2026-01-22T22:18:39Z|00069|binding|INFO|Claiming lport 09a74418-977b-4aa6-86a6-2d84a3cb143a for this chassis.
Jan 22 22:18:39 compute-0 ovn_controller[94850]: 2026-01-22T22:18:39Z|00070|binding|INFO|09a74418-977b-4aa6-86a6-2d84a3cb143a: Claiming fa:16:3e:59:3d:22 10.100.0.5
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.800 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:39 compute-0 systemd-udevd[213520]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:18:39 compute-0 systemd-machined[154006]: New machine qemu-7-instance-00000013.
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.839 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:3d:22 10.100.0.5'], port_security=['fa:16:3e:59:3d:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '008af030-d785-4936-871a-4d52ccebc8f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=09a74418-977b-4aa6-86a6-2d84a3cb143a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.841 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 09a74418-977b-4aa6-86a6-2d84a3cb143a in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c bound to our chassis
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.842 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c
Jan 22 22:18:39 compute-0 NetworkManager[54954]: <info>  [1769120319.8431] device (tap09a74418-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:18:39 compute-0 NetworkManager[54954]: <info>  [1769120319.8440] device (tap09a74418-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.855 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82e202c5-f1a1-4974-a02c-571cf609e9ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.856 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19dd816f-61 in ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.859 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19dd816f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.860 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f95d59e-68e9-4d07-a8d5-9ce32e3d824e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.861 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d97f1183-2016-4994-b057-37d9971233f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000013.
Jan 22 22:18:39 compute-0 ovn_controller[94850]: 2026-01-22T22:18:39Z|00071|binding|INFO|Setting lport 09a74418-977b-4aa6-86a6-2d84a3cb143a ovn-installed in OVS
Jan 22 22:18:39 compute-0 ovn_controller[94850]: 2026-01-22T22:18:39Z|00072|binding|INFO|Setting lport 09a74418-977b-4aa6-86a6-2d84a3cb143a up in Southbound
Jan 22 22:18:39 compute-0 nova_compute[182725]: 2026-01-22 22:18:39.874 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.874 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0c360b-9b86-4706-853b-3f459d850211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 podman[213503]: 2026-01-22 22:18:39.877698717 +0000 UTC m=+0.104986432 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.908 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cf05f101-e203-4f30-8be3-fee54a4406b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.945 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[74df9f70-48d0-4954-b4cb-911f83f4fa19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.951 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bf9fb3-86bf-4183-bc5c-d213da4c76b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:39 compute-0 NetworkManager[54954]: <info>  [1769120319.9534] manager: (tap19dd816f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 22 22:18:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.994 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ef7e92-b6fd-447b-a5d5-fb609c061dcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:39.998 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5e6958-b092-48f4-9da2-b0205e206626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 NetworkManager[54954]: <info>  [1769120320.0292] device (tap19dd816f-60): carrier: link connected
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.038 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3c4765-98db-42fa-b76c-941d4bab8da4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.063 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[922009ee-ccb4-4e13-91b8-2d5a49d9725d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395763, 'reachable_time': 31821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213564, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.082 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f5864696-8e17-4d78-a43b-e83c09197d7c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:7247'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395763, 'tstamp': 395763}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213565, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.108 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9b2141-2a8d-445d-966f-165da46aa4d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395763, 'reachable_time': 31821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213566, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.523 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[26ff8e56-a78a-4028-8769-4d0e058e4509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.527 182729 DEBUG nova.network.neutron [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Updated VIF entry in instance network info cache for port 09a74418-977b-4aa6-86a6-2d84a3cb143a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.527 182729 DEBUG nova.network.neutron [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Updating instance_info_cache with network_info: [{"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.530 182729 DEBUG nova.compute.manager [req-b18d855d-551b-4f09-89bd-b66bb56768a9 req-7ccfa062-f5bd-4fa3-8e03-764642dfe073 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.530 182729 DEBUG oslo_concurrency.lockutils [req-b18d855d-551b-4f09-89bd-b66bb56768a9 req-7ccfa062-f5bd-4fa3-8e03-764642dfe073 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.530 182729 DEBUG oslo_concurrency.lockutils [req-b18d855d-551b-4f09-89bd-b66bb56768a9 req-7ccfa062-f5bd-4fa3-8e03-764642dfe073 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.531 182729 DEBUG oslo_concurrency.lockutils [req-b18d855d-551b-4f09-89bd-b66bb56768a9 req-7ccfa062-f5bd-4fa3-8e03-764642dfe073 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.531 182729 DEBUG nova.compute.manager [req-b18d855d-551b-4f09-89bd-b66bb56768a9 req-7ccfa062-f5bd-4fa3-8e03-764642dfe073 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Processing event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.559 182729 DEBUG oslo_concurrency.lockutils [req-50c59fcc-8282-4876-9c6c-a6acc1e76647 req-83cf1b26-b97d-4b60-8263-657c606439ee 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-008af030-d785-4936-871a-4d52ccebc8f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.606 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fde0f742-f559-42b5-be36-4e3c3b48dbc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.608 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.608 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.608 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.610 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:40 compute-0 kernel: tap19dd816f-60: entered promiscuous mode
Jan 22 22:18:40 compute-0 NetworkManager[54954]: <info>  [1769120320.6116] manager: (tap19dd816f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.620 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:40 compute-0 ovn_controller[94850]: 2026-01-22T22:18:40Z|00073|binding|INFO|Releasing lport 32bed344-462e-4b45-8eb9-1fd48f73f73c from this chassis (sb_readonly=0)
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.627 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.629 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b6156a34-b4f5-4c92-8b18-bfb36dc8ec5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.630 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-19dd816f-669a-4bda-b508-a3ddcd4c2d7c
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 19dd816f-669a-4bda-b508-a3ddcd4c2d7c
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:18:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:40.632 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'env', 'PROCESS_TAG=haproxy-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.632 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.688 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120320.6873198, 008af030-d785-4936-871a-4d52ccebc8f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.688 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] VM Started (Lifecycle Event)
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.694 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.700 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.705 182729 INFO nova.virt.libvirt.driver [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Instance spawned successfully.
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.705 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.710 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.715 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.729 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.730 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.731 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.731 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.732 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.733 182729 DEBUG nova.virt.libvirt.driver [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.740 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.740 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120320.6876464, 008af030-d785-4936-871a-4d52ccebc8f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.741 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] VM Paused (Lifecycle Event)
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.777 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.782 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120320.6986704, 008af030-d785-4936-871a-4d52ccebc8f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.782 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] VM Resumed (Lifecycle Event)
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.811 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.816 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.833 182729 INFO nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Took 5.42 seconds to spawn the instance on the hypervisor.
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.834 182729 DEBUG nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.843 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.917 182729 INFO nova.compute.manager [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Took 6.02 seconds to build instance.
Jan 22 22:18:40 compute-0 nova_compute[182725]: 2026-01-22 22:18:40.973 182729 DEBUG oslo_concurrency.lockutils [None req-893b8c46-2e64-4497-b242-7196fab338fc f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:41 compute-0 podman[213602]: 2026-01-22 22:18:41.121314832 +0000 UTC m=+0.078707841 container create 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:18:41 compute-0 systemd[1]: Started libpod-conmon-53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04.scope.
Jan 22 22:18:41 compute-0 podman[213602]: 2026-01-22 22:18:41.086323372 +0000 UTC m=+0.043716411 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:18:41 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:18:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc8012d210d5001d0b8e47f8cfa4ccf4fd6427e37b92900a149199b63c337505/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:18:41 compute-0 podman[213602]: 2026-01-22 22:18:41.231730188 +0000 UTC m=+0.189123227 container init 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 22:18:41 compute-0 podman[213602]: 2026-01-22 22:18:41.244308485 +0000 UTC m=+0.201701494 container start 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:18:41 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [NOTICE]   (213623) : New worker (213625) forked
Jan 22 22:18:41 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [NOTICE]   (213623) : Loading success.
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.407 182729 DEBUG nova.compute.manager [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.408 182729 DEBUG oslo_concurrency.lockutils [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.408 182729 DEBUG oslo_concurrency.lockutils [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.409 182729 DEBUG oslo_concurrency.lockutils [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.409 182729 DEBUG nova.compute.manager [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] No waiting events found dispatching network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:18:42 compute-0 nova_compute[182725]: 2026-01-22 22:18:42.409 182729 WARNING nova.compute.manager [req-a30597c5-144d-4e90-a1fa-f4787ab1ad60 req-1c96cda1-622a-4f6b-8ea6-887d2edbebd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received unexpected event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a for instance with vm_state active and task_state None.
Jan 22 22:18:43 compute-0 nova_compute[182725]: 2026-01-22 22:18:43.514 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:43 compute-0 nova_compute[182725]: 2026-01-22 22:18:43.823 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:48 compute-0 nova_compute[182725]: 2026-01-22 22:18:48.516 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:48 compute-0 nova_compute[182725]: 2026-01-22 22:18:48.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.061 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.062 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.085 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.262 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.262 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.273 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.274 182729 INFO nova.compute.claims [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.432 182729 DEBUG nova.compute.provider_tree [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.448 182729 DEBUG nova.scheduler.client.report [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.470 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.472 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.628 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.629 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.658 182729 INFO nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.696 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.872 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.874 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.874 182729 INFO nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating image(s)
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.875 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.875 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.876 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.893 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.985 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.986 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:49 compute-0 nova_compute[182725]: 2026-01-22 22:18:49.987 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.001 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.087 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.088 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.120 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.121 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.122 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.177 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.180 182729 DEBUG nova.virt.disk.api [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Checking if we can resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.181 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.237 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.239 182729 DEBUG nova.virt.disk.api [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Cannot resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.239 182729 DEBUG nova.objects.instance [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'migration_context' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.260 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.261 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Ensure instance console log exists: /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.262 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.262 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.263 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:50 compute-0 nova_compute[182725]: 2026-01-22 22:18:50.742 182729 DEBUG nova.policy [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:18:52 compute-0 podman[213664]: 2026-01-22 22:18:52.175475964 +0000 UTC m=+0.095095834 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:18:53 compute-0 nova_compute[182725]: 2026-01-22 22:18:53.351 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Successfully created port: 580dc508-636a-420e-aed2-8efd9dccace5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:18:53 compute-0 nova_compute[182725]: 2026-01-22 22:18:53.519 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:53 compute-0 nova_compute[182725]: 2026-01-22 22:18:53.828 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:54 compute-0 ovn_controller[94850]: 2026-01-22T22:18:54Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:3d:22 10.100.0.5
Jan 22 22:18:54 compute-0 ovn_controller[94850]: 2026-01-22T22:18:54Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:3d:22 10.100.0.5
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.377 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Successfully updated port: 580dc508-636a-420e-aed2-8efd9dccace5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.407 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.407 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.407 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.638 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.692 182729 DEBUG nova.compute.manager [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-changed-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.692 182729 DEBUG nova.compute.manager [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing instance network info cache due to event network-changed-580dc508-636a-420e-aed2-8efd9dccace5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:18:55 compute-0 nova_compute[182725]: 2026-01-22 22:18:55.693 182729 DEBUG oslo_concurrency.lockutils [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:18:56 compute-0 podman[213686]: 2026-01-22 22:18:56.14860287 +0000 UTC m=+0.068699520 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc.)
Jan 22 22:18:56 compute-0 podman[213685]: 2026-01-22 22:18:56.172926812 +0000 UTC m=+0.099672249 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.835 182729 DEBUG nova.network.neutron [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.863 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.864 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance network_info: |[{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.864 182729 DEBUG oslo_concurrency.lockutils [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.865 182729 DEBUG nova.network.neutron [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.870 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Start _get_guest_xml network_info=[{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.877 182729 WARNING nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.883 182729 DEBUG nova.virt.libvirt.host [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.884 182729 DEBUG nova.virt.libvirt.host [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.893 182729 DEBUG nova.virt.libvirt.host [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.894 182729 DEBUG nova.virt.libvirt.host [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.895 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.895 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.895 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.896 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.896 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.896 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.896 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.896 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.897 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.897 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.897 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.897 182729 DEBUG nova.virt.hardware [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.901 182729 DEBUG nova.virt.libvirt.vif [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:49Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.901 182729 DEBUG nova.network.os_vif_util [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.902 182729 DEBUG nova.network.os_vif_util [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.903 182729 DEBUG nova.objects.instance [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'pci_devices' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.918 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <uuid>469eaf2b-7d53-40c9-a233-b27d702a21ed</uuid>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <name>instance-00000016</name>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:name>tempest-LiveMigrationTest-server-55126447</nova:name>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:18:56</nova:creationTime>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:user uuid="06b4b3807dc64d83b8bfbbf0c4d31d77">tempest-LiveMigrationTest-652633664-project-member</nova:user>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:project uuid="9ead4241c55147dcbe136a6d6a69a60f">tempest-LiveMigrationTest-652633664</nova:project>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         <nova:port uuid="580dc508-636a-420e-aed2-8efd9dccace5">
Jan 22 22:18:56 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <system>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="serial">469eaf2b-7d53-40c9-a233-b27d702a21ed</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="uuid">469eaf2b-7d53-40c9-a233-b27d702a21ed</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </system>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <os>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </os>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <features>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </features>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:01:a3:f5"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <target dev="tap580dc508-63"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/console.log" append="off"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <video>
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </video>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:18:56 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:18:56 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:18:56 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:18:56 compute-0 nova_compute[182725]: </domain>
Jan 22 22:18:56 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.919 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Preparing to wait for external event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.920 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.920 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.921 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.922 182729 DEBUG nova.virt.libvirt.vif [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:49Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.922 182729 DEBUG nova.network.os_vif_util [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.924 182729 DEBUG nova.network.os_vif_util [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.924 182729 DEBUG os_vif [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.925 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.926 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.927 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.933 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap580dc508-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.934 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap580dc508-63, col_values=(('external_ids', {'iface-id': '580dc508-636a-420e-aed2-8efd9dccace5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:a3:f5', 'vm-uuid': '469eaf2b-7d53-40c9-a233-b27d702a21ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.936 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:56 compute-0 NetworkManager[54954]: <info>  [1769120336.9383] manager: (tap580dc508-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:56 compute-0 nova_compute[182725]: 2026-01-22 22:18:56.949 182729 INFO os_vif [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.020 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.021 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.021 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No VIF found with MAC fa:16:3e:01:a3:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.021 182729 INFO nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Using config drive
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.947 182729 INFO nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating config drive at /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config
Jan 22 22:18:57 compute-0 nova_compute[182725]: 2026-01-22 22:18:57.955 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprht_f_1v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.095 182729 DEBUG oslo_concurrency.processutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprht_f_1v" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:18:58 compute-0 kernel: tap580dc508-63: entered promiscuous mode
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.1843] manager: (tap580dc508-63): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 22 22:18:58 compute-0 ovn_controller[94850]: 2026-01-22T22:18:58Z|00074|binding|INFO|Claiming lport 580dc508-636a-420e-aed2-8efd9dccace5 for this chassis.
Jan 22 22:18:58 compute-0 ovn_controller[94850]: 2026-01-22T22:18:58Z|00075|binding|INFO|580dc508-636a-420e-aed2-8efd9dccace5: Claiming fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.185 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.211 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.213 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea bound to our chassis
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.215 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.228 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4d32cf44-8d2a-47a2-a532-c8a636e645e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.229 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap698e77c5-f1 in ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.232 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap698e77c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.232 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7d09a3a5-1c0a-4f43-bf34-6e50ccca1a33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.233 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e59e444c-59be-4d1d-8cfc-3777aff84371]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 systemd-udevd[213749]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:18:58 compute-0 systemd-machined[154006]: New machine qemu-8-instance-00000016.
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.2690] device (tap580dc508-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.2704] device (tap580dc508-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.258 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8485490f-2037-4675-b7f3-e0b643efd5e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_controller[94850]: 2026-01-22T22:18:58Z|00076|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 ovn-installed in OVS
Jan 22 22:18:58 compute-0 ovn_controller[94850]: 2026-01-22T22:18:58Z|00077|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 up in Southbound
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.279 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000016.
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.288 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cf30ffaa-2f8c-4fbe-a9e0-0f02a1e1725a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.323 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[44a48c66-c6ab-4cf2-b7b5-ca9d1dd4b2de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.3305] manager: (tap698e77c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.328 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f12cdc89-2d07-43bd-8529-5347cd34509f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 systemd-udevd[213755]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.364 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2bbc2feb-8bc4-449b-8901-0ad4fa292fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.367 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c30664bf-f954-4a7c-abaa-54e7368beb3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.3951] device (tap698e77c5-f0): carrier: link connected
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.403 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4a86b38b-a008-4467-b686-89d1e2f455d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.423 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad83d80-3b65-40e6-bbef-5110b62cd64c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397599, 'reachable_time': 26321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213785, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.442 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5de5d3-5400-48d1-9263-2b0b63409461]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:3733'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397599, 'tstamp': 397599}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213786, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.455 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[620c121f-b1dd-4a49-9da5-0f4ab055d3f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397599, 'reachable_time': 26321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213787, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.500 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[29afb93f-604f-4536-8d77-8fbadebbdecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.522 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.581 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c29a91-a334-479d-a38a-6e9c6d14f476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.583 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.583 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.584 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.586 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 NetworkManager[54954]: <info>  [1769120338.5866] manager: (tap698e77c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 22 22:18:58 compute-0 kernel: tap698e77c5-f0: entered promiscuous mode
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.589 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.592 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.593 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_controller[94850]: 2026-01-22T22:18:58Z|00078|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.617 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.620 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.622 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f73e9f9b-4dd3-4fd3-a793-601032e68d4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.623 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:18:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:18:58.624 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'env', 'PROCESS_TAG=haproxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/698e77c5-fce6-47a5-b6e3-f4c56da226ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.705 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120338.7050285, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.706 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Started (Lifecycle Event)
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.762 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.777 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120338.706007, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.778 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Paused (Lifecycle Event)
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.810 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.816 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:18:58 compute-0 nova_compute[182725]: 2026-01-22 22:18:58.841 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:18:59 compute-0 podman[213826]: 2026-01-22 22:18:59.052520759 +0000 UTC m=+0.078355603 container create 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:18:59 compute-0 systemd[1]: Started libpod-conmon-9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126.scope.
Jan 22 22:18:59 compute-0 podman[213826]: 2026-01-22 22:18:59.013911317 +0000 UTC m=+0.039746211 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:18:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:18:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/178d76b2f687d89584b1584a0551919b2dbbe85d19aa54e20028a78942fe2b97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:18:59 compute-0 podman[213826]: 2026-01-22 22:18:59.159669165 +0000 UTC m=+0.185504069 container init 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 22:18:59 compute-0 podman[213826]: 2026-01-22 22:18:59.170600651 +0000 UTC m=+0.196435495 container start 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:18:59 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [NOTICE]   (213845) : New worker (213847) forked
Jan 22 22:18:59 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [NOTICE]   (213845) : Loading success.
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.355 182729 DEBUG nova.compute.manager [req-8413c5c0-eb8f-4d13-9b75-62ca363f4779 req-ea934d97-3ddf-487e-83fc-3380419ae44c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.356 182729 DEBUG oslo_concurrency.lockutils [req-8413c5c0-eb8f-4d13-9b75-62ca363f4779 req-ea934d97-3ddf-487e-83fc-3380419ae44c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.356 182729 DEBUG oslo_concurrency.lockutils [req-8413c5c0-eb8f-4d13-9b75-62ca363f4779 req-ea934d97-3ddf-487e-83fc-3380419ae44c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.357 182729 DEBUG oslo_concurrency.lockutils [req-8413c5c0-eb8f-4d13-9b75-62ca363f4779 req-ea934d97-3ddf-487e-83fc-3380419ae44c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.358 182729 DEBUG nova.compute.manager [req-8413c5c0-eb8f-4d13-9b75-62ca363f4779 req-ea934d97-3ddf-487e-83fc-3380419ae44c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Processing event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.359 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.365 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120339.364989, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.365 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Resumed (Lifecycle Event)
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.368 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.373 182729 INFO nova.virt.libvirt.driver [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance spawned successfully.
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.373 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.390 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.399 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.404 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.404 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.405 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.405 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.406 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.406 182729 DEBUG nova.virt.libvirt.driver [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.437 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.508 182729 INFO nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 9.64 seconds to spawn the instance on the hypervisor.
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.509 182729 DEBUG nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.629 182729 INFO nova.compute.manager [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 10.42 seconds to build instance.
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.647 182729 DEBUG oslo_concurrency.lockutils [None req-3f204bca-b979-438f-961e-bdfd93fc0067 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.690 182729 DEBUG nova.network.neutron [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updated VIF entry in instance network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.691 182729 DEBUG nova.network.neutron [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:18:59 compute-0 nova_compute[182725]: 2026-01-22 22:18:59.713 182729 DEBUG oslo_concurrency.lockutils [req-506fbd28-8309-470d-a94c-75d5c39ea19b req-ed18ee07-6fa8-469f-9967-ee6827648fc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.546 182729 DEBUG nova.compute.manager [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.546 182729 DEBUG oslo_concurrency.lockutils [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.547 182729 DEBUG oslo_concurrency.lockutils [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.547 182729 DEBUG oslo_concurrency.lockutils [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.547 182729 DEBUG nova.compute.manager [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.547 182729 WARNING nova.compute.manager [req-7e323d2f-b83f-4121-bad9-56704dcd38ef req-09e83221-0f84-40a9-b0fc-7279add1f0ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state None.
Jan 22 22:19:01 compute-0 nova_compute[182725]: 2026-01-22 22:19:01.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:03 compute-0 nova_compute[182725]: 2026-01-22 22:19:03.525 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:03 compute-0 nova_compute[182725]: 2026-01-22 22:19:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:03 compute-0 nova_compute[182725]: 2026-01-22 22:19:03.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:19:03 compute-0 nova_compute[182725]: 2026-01-22 22:19:03.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.144 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Check if temp file /var/lib/nova/instances/tmp6c7184m7 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.148 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.242 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.243 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.329 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.331 182729 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.911 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.911 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:19:04 compute-0 nova_compute[182725]: 2026-01-22 22:19:04.973 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:19:05 compute-0 nova_compute[182725]: 2026-01-22 22:19:05.362 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:05 compute-0 nova_compute[182725]: 2026-01-22 22:19:05.457 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:05 compute-0 nova_compute[182725]: 2026-01-22 22:19:05.459 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:05 compute-0 nova_compute[182725]: 2026-01-22 22:19:05.514 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:06 compute-0 podman[213870]: 2026-01-22 22:19:06.162919126 +0000 UTC m=+0.081169214 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:19:06 compute-0 podman[213869]: 2026-01-22 22:19:06.17422448 +0000 UTC m=+0.093195696 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:19:06 compute-0 nova_compute[182725]: 2026-01-22 22:19:06.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:06 compute-0 nova_compute[182725]: 2026-01-22 22:19:06.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:06 compute-0 nova_compute[182725]: 2026-01-22 22:19:06.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:07 compute-0 nova_compute[182725]: 2026-01-22 22:19:07.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:07 compute-0 nova_compute[182725]: 2026-01-22 22:19:07.921 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:07 compute-0 nova_compute[182725]: 2026-01-22 22:19:07.922 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:07 compute-0 nova_compute[182725]: 2026-01-22 22:19:07.922 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:07 compute-0 nova_compute[182725]: 2026-01-22 22:19:07.923 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.015 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:08 compute-0 sshd-session[213908]: Accepted publickey for nova from 192.168.122.102 port 58650 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:19:08 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:19:08 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:19:08 compute-0 systemd-logind[801]: New session 30 of user nova.
Jan 22 22:19:08 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:19:08 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.126 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json" returned: 0 in 0.110s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.127 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:08 compute-0 systemd[213914]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.190 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.203 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.274 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.275 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:08 compute-0 systemd[213914]: Queued start job for default target Main User Target.
Jan 22 22:19:08 compute-0 systemd[213914]: Created slice User Application Slice.
Jan 22 22:19:08 compute-0 systemd[213914]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:19:08 compute-0 systemd[213914]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:19:08 compute-0 systemd[213914]: Reached target Paths.
Jan 22 22:19:08 compute-0 systemd[213914]: Reached target Timers.
Jan 22 22:19:08 compute-0 systemd[213914]: Starting D-Bus User Message Bus Socket...
Jan 22 22:19:08 compute-0 systemd[213914]: Starting Create User's Volatile Files and Directories...
Jan 22 22:19:08 compute-0 systemd[213914]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:19:08 compute-0 systemd[213914]: Reached target Sockets.
Jan 22 22:19:08 compute-0 systemd[213914]: Finished Create User's Volatile Files and Directories.
Jan 22 22:19:08 compute-0 systemd[213914]: Reached target Basic System.
Jan 22 22:19:08 compute-0 systemd[213914]: Reached target Main User Target.
Jan 22 22:19:08 compute-0 systemd[213914]: Startup finished in 177ms.
Jan 22 22:19:08 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.342 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:08 compute-0 systemd[1]: Started Session 30 of User nova.
Jan 22 22:19:08 compute-0 sshd-session[213908]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:19:08 compute-0 sshd-session[213940]: Received disconnect from 192.168.122.102 port 58650:11: disconnected by user
Jan 22 22:19:08 compute-0 sshd-session[213940]: Disconnected from user nova 192.168.122.102 port 58650
Jan 22 22:19:08 compute-0 sshd-session[213908]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:19:08 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Jan 22 22:19:08 compute-0 systemd-logind[801]: Session 30 logged out. Waiting for processes to exit.
Jan 22 22:19:08 compute-0 systemd-logind[801]: Removed session 30.
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.526 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.539 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.540 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5409MB free_disk=73.35182571411133GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.541 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.541 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.599 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating resource usage from migration d66e6672-9239-4702-afa5-407b94c993b2
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.738 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 008af030-d785-4936-871a-4d52ccebc8f8 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.738 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration d66e6672-9239-4702-afa5-407b94c993b2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.738 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:19:08 compute-0 nova_compute[182725]: 2026-01-22 22:19:08.739 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:19:09 compute-0 nova_compute[182725]: 2026-01-22 22:19:09.045 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:09 compute-0 nova_compute[182725]: 2026-01-22 22:19:09.190 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:09 compute-0 nova_compute[182725]: 2026-01-22 22:19:09.230 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:19:09 compute-0 nova_compute[182725]: 2026-01-22 22:19:09.231 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:10 compute-0 podman[213942]: 2026-01-22 22:19:10.171221017 +0000 UTC m=+0.096914850 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.232 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.233 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.233 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.234 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.385 182729 DEBUG nova.compute.manager [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.386 182729 DEBUG oslo_concurrency.lockutils [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.386 182729 DEBUG oslo_concurrency.lockutils [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.386 182729 DEBUG oslo_concurrency.lockutils [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.386 182729 DEBUG nova.compute.manager [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.387 182729 DEBUG nova.compute.manager [req-7015edfd-fccc-4d43-8cea-b3b9fec6a1c2 req-dba65feb-353b-4299-8b98-efb2a0757ee2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.838 182729 INFO nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 5.32 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.839 182729 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.861 182729 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d66e6672-9239-4702-afa5-407b94c993b2),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.883 182729 DEBUG nova.objects.instance [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.885 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.887 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.887 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.908 182729 DEBUG nova.virt.libvirt.vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:18:59Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.909 182729 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.910 182729 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.911 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 22:19:10 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:01:a3:f5"/>
Jan 22 22:19:10 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:19:10 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:19:10 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:19:10 compute-0 nova_compute[182725]:   <target dev="tap580dc508-63"/>
Jan 22 22:19:10 compute-0 nova_compute[182725]: </interface>
Jan 22 22:19:10 compute-0 nova_compute[182725]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 22 22:19:10 compute-0 nova_compute[182725]: 2026-01-22 22:19:10.911 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.391 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.391 182729 INFO nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.469 182729 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.972 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:11 compute-0 nova_compute[182725]: 2026-01-22 22:19:11.973 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:12 compute-0 ovn_controller[94850]: 2026-01-22T22:19:12Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 22:19:12 compute-0 ovn_controller[94850]: 2026-01-22T22:19:12Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 22:19:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:12.426 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:12.428 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:12.429 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.475 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.476 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.549 182729 DEBUG nova.compute.manager [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.549 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.550 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.550 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.550 182729 DEBUG nova.compute.manager [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.551 182729 WARNING nova.compute.manager [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.551 182729 DEBUG nova.compute.manager [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-changed-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.551 182729 DEBUG nova.compute.manager [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing instance network info cache due to event network-changed-580dc508-636a-420e-aed2-8efd9dccace5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.552 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.552 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.552 182729 DEBUG nova.network.neutron [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.979 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:12 compute-0 nova_compute[182725]: 2026-01-22 22:19:12.979 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.483 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.483 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.528 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.908 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.988 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:13 compute-0 nova_compute[182725]: 2026-01-22 22:19:13.989 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:14 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 22:19:14 compute-0 nova_compute[182725]: 2026-01-22 22:19:14.492 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:14 compute-0 nova_compute[182725]: 2026-01-22 22:19:14.493 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:14 compute-0 nova_compute[182725]: 2026-01-22 22:19:14.997 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:14 compute-0 nova_compute[182725]: 2026-01-22 22:19:14.998 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.502 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.502 182729 DEBUG nova.virt.libvirt.migration [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.702 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120355.7014961, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.702 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Paused (Lifecycle Event)
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.728 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.735 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.776 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 22:19:15 compute-0 kernel: tap580dc508-63 (unregistering): left promiscuous mode
Jan 22 22:19:15 compute-0 NetworkManager[54954]: <info>  [1769120355.8606] device (tap580dc508-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:19:15 compute-0 ovn_controller[94850]: 2026-01-22T22:19:15Z|00079|binding|INFO|Releasing lport 580dc508-636a-420e-aed2-8efd9dccace5 from this chassis (sb_readonly=0)
Jan 22 22:19:15 compute-0 ovn_controller[94850]: 2026-01-22T22:19:15Z|00080|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 down in Southbound
Jan 22 22:19:15 compute-0 ovn_controller[94850]: 2026-01-22T22:19:15Z|00081|binding|INFO|Removing iface tap580dc508-63 ovn-installed in OVS
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.876 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:15.890 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e130c2ec-fef7-4ed2-892d-1e3d7eaab401'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:15.892 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea unbound from our chassis
Jan 22 22:19:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:15.894 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:19:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:15.896 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0a10b5-18e2-4216-be60-43476abffe6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:15 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:15.897 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace which is not needed anymore
Jan 22 22:19:15 compute-0 nova_compute[182725]: 2026-01-22 22:19:15.899 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:15 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 22 22:19:15 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000016.scope: Consumed 13.081s CPU time.
Jan 22 22:19:15 compute-0 systemd-machined[154006]: Machine qemu-8-instance-00000016 terminated.
Jan 22 22:19:16 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [NOTICE]   (213845) : haproxy version is 2.8.14-c23fe91
Jan 22 22:19:16 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [NOTICE]   (213845) : path to executable is /usr/sbin/haproxy
Jan 22 22:19:16 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [WARNING]  (213845) : Exiting Master process...
Jan 22 22:19:16 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [ALERT]    (213845) : Current worker (213847) exited with code 143 (Terminated)
Jan 22 22:19:16 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[213841]: [WARNING]  (213845) : All workers exited. Exiting... (0)
Jan 22 22:19:16 compute-0 systemd[1]: libpod-9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126.scope: Deactivated successfully.
Jan 22 22:19:16 compute-0 podman[214008]: 2026-01-22 22:19:16.0929673 +0000 UTC m=+0.047559447 container died 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.092 182729 DEBUG nova.network.neutron [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updated VIF entry in instance network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.094 182729 DEBUG nova.network.neutron [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.101 182729 DEBUG nova.virt.libvirt.guest [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.102 182729 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation has completed
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.103 182729 INFO nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] _post_live_migration() is started..
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.106 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.106 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.107 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.109 182729 DEBUG oslo_concurrency.lockutils [req-fe0b373c-f4d2-4980-b028-a9c6a8fbd19f req-28a7bd26-11cf-4c61-b71d-c2c008ac0df9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126-userdata-shm.mount: Deactivated successfully.
Jan 22 22:19:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-178d76b2f687d89584b1584a0551919b2dbbe85d19aa54e20028a78942fe2b97-merged.mount: Deactivated successfully.
Jan 22 22:19:16 compute-0 podman[214008]: 2026-01-22 22:19:16.146770394 +0000 UTC m=+0.101362551 container cleanup 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:19:16 compute-0 systemd[1]: libpod-conmon-9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126.scope: Deactivated successfully.
Jan 22 22:19:16 compute-0 podman[214054]: 2026-01-22 22:19:16.211223766 +0000 UTC m=+0.041548786 container remove 9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.218 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3db36d1c-5b25-4ce2-90ea-4aa89f91bedd]: (4, ('Thu Jan 22 10:19:16 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126)\n9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126\nThu Jan 22 10:19:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126)\n9dd059b8a469909602ebbb5c71ea366c997b1bb772a5efc8841bbd0693ce4126\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.221 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ef5e7e-4f88-4339-96ee-48b5ce19f433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.222 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:16 compute-0 kernel: tap698e77c5-f0: left promiscuous mode
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.225 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.243 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.245 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccb730f-8273-4b73-84df-02df29486f29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.261 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d53996ec-0fc8-4132-9f1b-9630ffc547da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.263 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8cd291-3e77-456c-a7ce-f595c690a3dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.282 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d85d2bb-4387-4290-93ab-01212762efa1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397591, 'reachable_time': 18495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214073, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d698e77c5\x2dfce6\x2d47a5\x2db6e3\x2df4c56da226ea.mount: Deactivated successfully.
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.287 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:19:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:16.287 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[16f34a8d-89b0-46e9-9d90-d434580fa5fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.327 182729 DEBUG nova.compute.manager [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.328 182729 DEBUG oslo_concurrency.lockutils [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.328 182729 DEBUG oslo_concurrency.lockutils [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.329 182729 DEBUG oslo_concurrency.lockutils [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.329 182729 DEBUG nova.compute.manager [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.329 182729 DEBUG nova.compute.manager [req-e046691a-8fa0-4671-93c1-4f01b87b24cb req-212ea0f8-414c-4ea7-9c22-605dd2499c23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.560 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.561 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.579 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.677 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.677 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.700 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.715 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.715 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.726 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.727 182729 INFO nova.compute.claims [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.843 182729 DEBUG nova.compute.manager [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.843 182729 DEBUG oslo_concurrency.lockutils [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.845 182729 DEBUG oslo_concurrency.lockutils [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.845 182729 DEBUG oslo_concurrency.lockutils [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.846 182729 DEBUG nova.compute.manager [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.846 182729 DEBUG nova.compute.manager [req-e543b850-ee3e-4e9e-960e-03657dda1c4a req-7d26a8bf-bc6b-4f17-a0d7-eca63cfb706a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.851 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.953 182729 DEBUG nova.compute.provider_tree [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.958 182729 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Activated binding for port 580dc508-636a-420e-aed2-8efd9dccace5 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.958 182729 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.959 182729 DEBUG nova.virt.libvirt.vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:03Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.960 182729 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.961 182729 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.961 182729 DEBUG os_vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.963 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.964 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap580dc508-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.966 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.968 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.971 182729 INFO os_vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.972 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:16 compute-0 nova_compute[182725]: 2026-01-22 22:19:16.974 182729 DEBUG nova.scheduler.client.report [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.000 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.002 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.010 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.010 182729 INFO nova.compute.claims [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.019 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.019 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.052 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.054 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.117 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.117 182729 DEBUG nova.network.neutron [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.134 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.159 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.246 182729 DEBUG nova.compute.provider_tree [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.263 182729 DEBUG nova.scheduler.client.report [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.305 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.307 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.308 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.308 182729 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.309 182729 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deleting instance files /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.310 182729 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deletion of /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del complete
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.324 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.326 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.327 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Creating image(s)
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.327 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.328 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.329 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.360 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.362 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.366 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.406 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1f73a4ac-704d-4ff6-abe1-f9f2dae0807f" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.412 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.471 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.472 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.473 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.498 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.528 182729 DEBUG nova.network.neutron [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.529 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.537 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.537 182729 DEBUG nova.network.neutron [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.580 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.581 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.612 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.639 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.641 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.642 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.675 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.726 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.728 182729 DEBUG nova.virt.disk.api [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Checking if we can resize image /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.729 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.812 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.814 182729 DEBUG nova.virt.disk.api [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Cannot resize image /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.815 182729 DEBUG nova.objects.instance [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'migration_context' on Instance uuid 1c330292-7fe1-4a26-a2d0-85ee27c734f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.842 182729 DEBUG nova.network.neutron [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.842 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.872 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.873 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Ensure instance console log exists: /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.874 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.874 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.875 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.878 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.886 182729 WARNING nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.895 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.896 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.901 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.902 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.904 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.905 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.905 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.906 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.906 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.907 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.907 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.908 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.908 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.909 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.909 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.910 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.917 182729 DEBUG nova.objects.instance [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c330292-7fe1-4a26-a2d0-85ee27c734f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.938 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.940 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.941 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Creating image(s)
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.942 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.943 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.944 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.967 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <uuid>1c330292-7fe1-4a26-a2d0-85ee27c734f0</uuid>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <name>instance-00000019</name>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersOnMultiNodesTest-server-359807656-1</nova:name>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:19:17</nova:creationTime>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:user uuid="7c7976c0d8814435b29d032d44312d82">tempest-ServersOnMultiNodesTest-1342288026-project-member</nova:user>
Jan 22 22:19:17 compute-0 nova_compute[182725]:         <nova:project uuid="bb26d5e006aa4c1a8f553f412a76778a">tempest-ServersOnMultiNodesTest-1342288026</nova:project>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <system>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="serial">1c330292-7fe1-4a26-a2d0-85ee27c734f0</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="uuid">1c330292-7fe1-4a26-a2d0-85ee27c734f0</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </system>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <os>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </os>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <features>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </features>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.config"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/console.log" append="off"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <video>
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </video>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:19:17 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:19:17 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:19:17 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:19:17 compute-0 nova_compute[182725]: </domain>
Jan 22 22:19:17 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:19:17 compute-0 nova_compute[182725]: 2026-01-22 22:19:17.969 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.065 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.066 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.067 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Using config drive
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.071 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.072 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.073 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.097 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.158 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.160 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.206 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.208 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.209 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.302 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.304 182729 DEBUG nova.virt.disk.api [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Checking if we can resize image /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.305 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.387 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.389 182729 DEBUG nova.virt.disk.api [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Cannot resize image /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.389 182729 DEBUG nova.objects.instance [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'migration_context' on Instance uuid be44208f-27c9-4da7-a5bc-5c2583fdb393 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.412 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.413 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Ensure instance console log exists: /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.414 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.415 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.415 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.418 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.427 182729 WARNING nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.434 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.436 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.440 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.441 182729 DEBUG nova.virt.libvirt.host [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.443 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.443 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.444 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.445 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.445 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.445 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.446 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.446 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.447 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.447 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.447 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.448 182729 DEBUG nova.virt.hardware [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.454 182729 DEBUG nova.objects.instance [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'pci_devices' on Instance uuid be44208f-27c9-4da7-a5bc-5c2583fdb393 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.478 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <uuid>be44208f-27c9-4da7-a5bc-5c2583fdb393</uuid>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <name>instance-0000001a</name>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersOnMultiNodesTest-server-359807656-2</nova:name>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:19:18</nova:creationTime>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:user uuid="7c7976c0d8814435b29d032d44312d82">tempest-ServersOnMultiNodesTest-1342288026-project-member</nova:user>
Jan 22 22:19:18 compute-0 nova_compute[182725]:         <nova:project uuid="bb26d5e006aa4c1a8f553f412a76778a">tempest-ServersOnMultiNodesTest-1342288026</nova:project>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <system>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="serial">be44208f-27c9-4da7-a5bc-5c2583fdb393</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="uuid">be44208f-27c9-4da7-a5bc-5c2583fdb393</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </system>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <os>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </os>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <features>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </features>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.config"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/console.log" append="off"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <video>
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </video>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:19:18 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:19:18 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:19:18 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:19:18 compute-0 nova_compute[182725]: </domain>
Jan 22 22:19:18 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.532 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:18 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.543 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.544 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:18 compute-0 systemd[213914]: Activating special unit Exit the Session...
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped target Main User Target.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped target Basic System.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped target Paths.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped target Sockets.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped target Timers.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.545 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Using config drive
Jan 22 22:19:18 compute-0 systemd[213914]: Closed D-Bus User Message Bus Socket.
Jan 22 22:19:18 compute-0 systemd[213914]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:19:18 compute-0 systemd[213914]: Removed slice User Application Slice.
Jan 22 22:19:18 compute-0 systemd[213914]: Reached target Shutdown.
Jan 22 22:19:18 compute-0 systemd[213914]: Finished Exit the Session.
Jan 22 22:19:18 compute-0 systemd[213914]: Reached target Exit the Session.
Jan 22 22:19:18 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:19:18 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:19:18 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:19:18 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:19:18 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:19:18 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:19:18 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.711 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Creating config drive at /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.config
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.719 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplb_mt30t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.832 182729 INFO nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Creating config drive at /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.config
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.842 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp39jaj4x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.869 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplb_mt30t" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 systemd-machined[154006]: New machine qemu-9-instance-00000019.
Jan 22 22:19:18 compute-0 nova_compute[182725]: 2026-01-22 22:19:18.982 182729 DEBUG oslo_concurrency.processutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp39jaj4x" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:18 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000019.
Jan 22 22:19:19 compute-0 systemd-machined[154006]: New machine qemu-10-instance-0000001a.
Jan 22 22:19:19 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000001a.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.498 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120359.498157, be44208f-27c9-4da7-a5bc-5c2583fdb393 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.499 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] VM Resumed (Lifecycle Event)
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.504 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.505 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.511 182729 INFO nova.virt.libvirt.driver [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance spawned successfully.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.511 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.526 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.535 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.540 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.541 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.541 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.542 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.543 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.543 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.588 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.588 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120359.5039208, be44208f-27c9-4da7-a5bc-5c2583fdb393 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.589 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] VM Started (Lifecycle Event)
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.618 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.623 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.645 182729 INFO nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Took 1.71 seconds to spawn the instance on the hypervisor.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.646 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.648 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.822 182729 INFO nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Took 3.04 seconds to build instance.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.847 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.863 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.864 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.864 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.864 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.865 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.865 182729 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.865 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.866 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.866 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.866 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.867 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.867 182729 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.867 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.867 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.868 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.868 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.868 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.869 182729 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.869 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.869 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.870 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.870 182729 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.870 182729 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.870 182729 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.952 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120359.9524171, 1c330292-7fe1-4a26-a2d0-85ee27c734f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.953 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] VM Resumed (Lifecycle Event)
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.955 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.956 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.962 182729 INFO nova.virt.libvirt.driver [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance spawned successfully.
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.962 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:19:19 compute-0 nova_compute[182725]: 2026-01-22 22:19:19.992 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.001 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.008 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.008 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.009 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.010 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.011 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.012 182729 DEBUG nova.virt.libvirt.driver [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.024 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.025 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120359.9525445, 1c330292-7fe1-4a26-a2d0-85ee27c734f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.026 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] VM Started (Lifecycle Event)
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.058 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.062 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.086 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.112 182729 INFO nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Took 2.79 seconds to spawn the instance on the hypervisor.
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.113 182729 DEBUG nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.205 182729 INFO nova.compute.manager [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Took 3.54 seconds to build instance.
Jan 22 22:19:20 compute-0 nova_compute[182725]: 2026-01-22 22:19:20.225 182729 DEBUG oslo_concurrency.lockutils [None req-9d570f50-f13c-4a45-a229-a54a2c70d486 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:21 compute-0 nova_compute[182725]: 2026-01-22 22:19:21.965 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:23 compute-0 podman[214159]: 2026-01-22 22:19:23.17521672 +0000 UTC m=+0.090236862 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.532 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.942 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.943 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.943 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.974 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.975 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.975 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:23 compute-0 nova_compute[182725]: 2026-01-22 22:19:23.975 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.083 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.154 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.156 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.218 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.227 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.284 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.286 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.345 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.355 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.460 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.462 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.527 182729 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.740 182729 WARNING nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.741 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5309MB free_disk=73.35087203979492GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.741 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.741 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.826 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration for instance 469eaf2b-7d53-40c9-a233-b27d702a21ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.863 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.913 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance 008af030-d785-4936-871a-4d52ccebc8f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.914 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration d66e6672-9239-4702-afa5-407b94c993b2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.914 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance 1c330292-7fe1-4a26-a2d0-85ee27c734f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.914 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance be44208f-27c9-4da7-a5bc-5c2583fdb393 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.915 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:19:24 compute-0 nova_compute[182725]: 2026-01-22 22:19:24.915 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.097 182729 DEBUG nova.compute.provider_tree [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.119 182729 DEBUG nova.scheduler.client.report [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.157 182729 DEBUG nova.compute.resource_tracker [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.157 182729 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.188 182729 INFO nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.278 182729 INFO nova.scheduler.client.report [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Deleted allocation for migration d66e6672-9239-4702-afa5-407b94c993b2
Jan 22 22:19:25 compute-0 nova_compute[182725]: 2026-01-22 22:19:25.279 182729 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.316 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.318 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.347 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.450 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.451 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.462 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.463 182729 INFO nova.compute.claims [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.753 182729 DEBUG nova.compute.provider_tree [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.770 182729 DEBUG nova.scheduler.client.report [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.803 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.847 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.847 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.872 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.910 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.913 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:19:26 compute-0 nova_compute[182725]: 2026-01-22 22:19:26.967 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.002 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.003 182729 DEBUG nova.network.neutron [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.044 182729 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.062 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:19:27 compute-0 podman[214199]: 2026-01-22 22:19:27.132737503 +0000 UTC m=+0.061968911 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6)
Jan 22 22:19:27 compute-0 podman[214198]: 2026-01-22 22:19:27.162860231 +0000 UTC m=+0.094905690 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.220 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.221 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.222 182729 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Creating image(s)
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.223 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.223 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.224 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.253 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.346 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.350 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.351 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.366 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.427 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.428 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.484 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.486 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.487 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.556 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.557 182729 DEBUG nova.virt.disk.api [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Checking if we can resize image /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.558 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.596 182729 DEBUG nova.network.neutron [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.597 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.639 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating tmpfile /var/lib/nova/instances/tmpyr59vjpp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.641 182729 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.657 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.658 182729 DEBUG nova.virt.disk.api [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Cannot resize image /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.659 182729 DEBUG nova.objects.instance [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'migration_context' on Instance uuid c20e5ebc-adf4-4b83-af3a-908b9b574a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.677 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.677 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Ensure instance console log exists: /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.678 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.678 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.678 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.679 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.686 182729 WARNING nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.691 182729 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.693 182729 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.697 182729 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.698 182729 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.699 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.700 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.701 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.701 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.702 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.703 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.703 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.704 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.704 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.705 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.705 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.705 182729 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.710 182729 DEBUG nova.objects.instance [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'pci_devices' on Instance uuid c20e5ebc-adf4-4b83-af3a-908b9b574a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:27 compute-0 nova_compute[182725]: 2026-01-22 22:19:27.732 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <uuid>c20e5ebc-adf4-4b83-af3a-908b9b574a25</uuid>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <name>instance-0000001b</name>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersOnMultiNodesTest-server-975595322-1</nova:name>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:19:27</nova:creationTime>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:user uuid="7c7976c0d8814435b29d032d44312d82">tempest-ServersOnMultiNodesTest-1342288026-project-member</nova:user>
Jan 22 22:19:27 compute-0 nova_compute[182725]:         <nova:project uuid="bb26d5e006aa4c1a8f553f412a76778a">tempest-ServersOnMultiNodesTest-1342288026</nova:project>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <system>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="serial">c20e5ebc-adf4-4b83-af3a-908b9b574a25</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="uuid">c20e5ebc-adf4-4b83-af3a-908b9b574a25</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </system>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <os>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </os>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <features>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </features>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.config"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/console.log" append="off"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <video>
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </video>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:19:27 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:19:27 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:19:27 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:19:27 compute-0 nova_compute[182725]: </domain>
Jan 22 22:19:27 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.111 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.112 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.113 182729 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Using config drive
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.703 182729 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Creating config drive at /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.config
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.712 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpox6lql6i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.843 182729 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpox6lql6i" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:28 compute-0 systemd-machined[154006]: New machine qemu-11-instance-0000001b.
Jan 22 22:19:28 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000001b.
Jan 22 22:19:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:28.971 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:28.973 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:19:28 compute-0 nova_compute[182725]: 2026-01-22 22:19:28.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.349 182729 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.373 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.374 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.374 182729 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.430 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120369.42859, c20e5ebc-adf4-4b83-af3a-908b9b574a25 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.430 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] VM Resumed (Lifecycle Event)
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.433 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.433 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.449 182729 INFO nova.virt.libvirt.driver [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance spawned successfully.
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.449 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.466 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.477 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.484 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.485 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.486 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.486 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.487 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.488 182729 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.514 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.515 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120369.4287524, c20e5ebc-adf4-4b83-af3a-908b9b574a25 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.516 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] VM Started (Lifecycle Event)
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.547 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.552 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.572 182729 INFO nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Took 2.35 seconds to spawn the instance on the hypervisor.
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.572 182729 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.576 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.653 182729 INFO nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Took 3.25 seconds to build instance.
Jan 22 22:19:29 compute-0 nova_compute[182725]: 2026-01-22 22:19:29.668 182729 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.100 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120356.0993161, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.101 182729 INFO nova.compute.manager [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Stopped (Lifecycle Event)
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.218 182729 DEBUG nova.compute.manager [None req-7361572f-2f33-4919-b880-e96cd496ed92 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.367 182729 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.511 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.524 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.524 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating instance directory: /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.525 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating disk.info with the contents: {'/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk': 'qcow2', '/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.526 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.527 182729 DEBUG nova.objects.instance [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'trusted_certs' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.561 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.628 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.631 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.632 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.648 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.711 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.712 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.751 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.752 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.753 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.812 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.813 182729 DEBUG nova.virt.disk.api [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Checking if we can resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.814 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.873 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.874 182729 DEBUG nova.virt.disk.api [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Cannot resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.875 182729 DEBUG nova.objects.instance [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.891 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.919 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.922 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config to /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.922 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:31 compute-0 nova_compute[182725]: 2026-01-22 22:19:31.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.354 182729 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.355 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.357 182729 DEBUG nova.virt.libvirt.vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:23Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.357 182729 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.359 182729 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.359 182729 DEBUG os_vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.360 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.360 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.361 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.364 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.365 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap580dc508-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.365 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap580dc508-63, col_values=(('external_ids', {'iface-id': '580dc508-636a-420e-aed2-8efd9dccace5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:a3:f5', 'vm-uuid': '469eaf2b-7d53-40c9-a233-b27d702a21ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:32 compute-0 NetworkManager[54954]: <info>  [1769120372.3681] manager: (tap580dc508-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.371 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.378 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.379 182729 INFO os_vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.380 182729 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 22 22:19:32 compute-0 nova_compute[182725]: 2026-01-22 22:19:32.380 182729 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 22 22:19:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:32.977 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.121 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.123 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.123 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.124 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.124 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.137 182729 INFO nova.compute.manager [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Terminating instance
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.149 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "refresh_cache-c20e5ebc-adf4-4b83-af3a-908b9b574a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.149 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquired lock "refresh_cache-c20e5ebc-adf4-4b83-af3a-908b9b574a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.150 182729 DEBUG nova.network.neutron [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.444 182729 DEBUG nova.network.neutron [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.537 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.919 182729 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Port 580dc508-636a-420e-aed2-8efd9dccace5 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 22 22:19:33 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.931 182729 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:33.999 182729 DEBUG nova.network.neutron [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.016 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Releasing lock "refresh_cache-c20e5ebc-adf4-4b83-af3a-908b9b574a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.016 182729 DEBUG nova.compute.manager [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:19:34 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 22 22:19:34 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001b.scope: Consumed 5.105s CPU time.
Jan 22 22:19:34 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 22 22:19:34 compute-0 systemd-machined[154006]: Machine qemu-11-instance-0000001b terminated.
Jan 22 22:19:34 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.310 182729 INFO nova.virt.libvirt.driver [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance destroyed successfully.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.312 182729 DEBUG nova.objects.instance [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'resources' on Instance uuid c20e5ebc-adf4-4b83-af3a-908b9b574a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.331 182729 INFO nova.virt.libvirt.driver [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Deleting instance files /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25_del
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.333 182729 INFO nova.virt.libvirt.driver [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Deletion of /var/lib/nova/instances/c20e5ebc-adf4-4b83-af3a-908b9b574a25_del complete
Jan 22 22:19:34 compute-0 kernel: tap580dc508-63: entered promiscuous mode
Jan 22 22:19:34 compute-0 NetworkManager[54954]: <info>  [1769120374.3490] manager: (tap580dc508-63): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Jan 22 22:19:34 compute-0 ovn_controller[94850]: 2026-01-22T22:19:34Z|00082|binding|INFO|Claiming lport 580dc508-636a-420e-aed2-8efd9dccace5 for this additional chassis.
Jan 22 22:19:34 compute-0 systemd-udevd[214335]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.351 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:34 compute-0 ovn_controller[94850]: 2026-01-22T22:19:34Z|00083|binding|INFO|580dc508-636a-420e-aed2-8efd9dccace5: Claiming fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.354 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:34 compute-0 ovn_controller[94850]: 2026-01-22T22:19:34Z|00084|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 ovn-installed in OVS
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.366 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.367 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:34 compute-0 NetworkManager[54954]: <info>  [1769120374.3745] device (tap580dc508-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:19:34 compute-0 NetworkManager[54954]: <info>  [1769120374.3755] device (tap580dc508-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:19:34 compute-0 systemd-machined[154006]: New machine qemu-12-instance-00000016.
Jan 22 22:19:34 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000016.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.443 182729 INFO nova.compute.manager [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.445 182729 DEBUG oslo.service.loopingcall [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.445 182729 DEBUG nova.compute.manager [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.446 182729 DEBUG nova.network.neutron [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.705 182729 DEBUG nova.network.neutron [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.720 182729 DEBUG nova.network.neutron [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.753 182729 INFO nova.compute.manager [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Took 0.31 seconds to deallocate network for instance.
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.886 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:34 compute-0 nova_compute[182725]: 2026-01-22 22:19:34.886 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.074 182729 DEBUG nova.compute.provider_tree [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.096 182729 DEBUG nova.scheduler.client.report [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.123 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.150 182729 INFO nova.scheduler.client.report [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Deleted allocations for instance c20e5ebc-adf4-4b83-af3a-908b9b574a25
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.186 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120375.1865098, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.186 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Started (Lifecycle Event)
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.210 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:35 compute-0 nova_compute[182725]: 2026-01-22 22:19:35.250 182729 DEBUG oslo_concurrency.lockutils [None req-42e161b9-2500-457b-a564-63f1e2518053 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "c20e5ebc-adf4-4b83-af3a-908b9b574a25" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.622 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120376.6222951, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.623 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Resumed (Lifecycle Event)
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.643 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.647 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.683 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.985 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.985 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.986 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.986 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.986 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:36 compute-0 nova_compute[182725]: 2026-01-22 22:19:36.997 182729 INFO nova.compute.manager [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Terminating instance
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.011 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "refresh_cache-1c330292-7fe1-4a26-a2d0-85ee27c734f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.011 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquired lock "refresh_cache-1c330292-7fe1-4a26-a2d0-85ee27c734f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.012 182729 DEBUG nova.network.neutron [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:37 compute-0 podman[214402]: 2026-01-22 22:19:37.145846397 +0000 UTC m=+0.070337951 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:19:37 compute-0 podman[214401]: 2026-01-22 22:19:37.146476463 +0000 UTC m=+0.071011918 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.227 182729 DEBUG nova.network.neutron [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.277 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.278 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.279 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "be44208f-27c9-4da7-a5bc-5c2583fdb393-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.279 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.280 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.292 182729 INFO nova.compute.manager [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Terminating instance
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.306 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "refresh_cache-be44208f-27c9-4da7-a5bc-5c2583fdb393" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.307 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquired lock "refresh_cache-be44208f-27c9-4da7-a5bc-5c2583fdb393" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.307 182729 DEBUG nova.network.neutron [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.367 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.697 182729 DEBUG nova.network.neutron [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.729 182729 DEBUG nova.network.neutron [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.756 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Releasing lock "refresh_cache-1c330292-7fe1-4a26-a2d0-85ee27c734f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:37 compute-0 nova_compute[182725]: 2026-01-22 22:19:37.757 182729 DEBUG nova.compute.manager [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:19:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 22 22:19:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000019.scope: Consumed 12.930s CPU time.
Jan 22 22:19:37 compute-0 systemd-machined[154006]: Machine qemu-9-instance-00000019 terminated.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.033 182729 DEBUG nova.network.neutron [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.040 182729 INFO nova.virt.libvirt.driver [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance destroyed successfully.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.041 182729 DEBUG nova.objects.instance [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'resources' on Instance uuid 1c330292-7fe1-4a26-a2d0-85ee27c734f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:38 compute-0 ovn_controller[94850]: 2026-01-22T22:19:38Z|00085|binding|INFO|Claiming lport 580dc508-636a-420e-aed2-8efd9dccace5 for this chassis.
Jan 22 22:19:38 compute-0 ovn_controller[94850]: 2026-01-22T22:19:38Z|00086|binding|INFO|580dc508-636a-420e-aed2-8efd9dccace5: Claiming fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 22:19:38 compute-0 ovn_controller[94850]: 2026-01-22T22:19:38Z|00087|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 up in Southbound
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.055 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '21', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.057 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea bound to our chassis
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.058 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Releasing lock "refresh_cache-be44208f-27c9-4da7-a5bc-5c2583fdb393" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.059 182729 DEBUG nova.compute.manager [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.060 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.061 182729 INFO nova.virt.libvirt.driver [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Deleting instance files /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0_del
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.063 182729 INFO nova.virt.libvirt.driver [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Deletion of /var/lib/nova/instances/1c330292-7fe1-4a26-a2d0-85ee27c734f0_del complete
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.078 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc54491-641d-4f94-b1dc-c749fce0e105]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.079 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap698e77c5-f1 in ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.084 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap698e77c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.084 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a21d51de-5194-48d2-8927-c171d0612499]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.086 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[936fdfea-c0d6-49c7-b616-962c0ea047ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.109 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[753f2f2a-331d-464b-a03f-d45c5b1ed851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 22 22:19:38 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001a.scope: Consumed 12.330s CPU time.
Jan 22 22:19:38 compute-0 systemd-machined[154006]: Machine qemu-10-instance-0000001a terminated.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.133 182729 INFO nova.compute.manager [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.133 182729 DEBUG oslo.service.loopingcall [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.134 182729 DEBUG nova.compute.manager [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.135 182729 DEBUG nova.network.neutron [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.140 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5a027d7f-36b3-4350-b6e4-b8841f3683e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.185 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[898b026d-809f-45f6-aa3b-fd353290693a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.196 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[689ba9c0-265d-4e88-af4f-8b0ca2a596a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 systemd-udevd[214441]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:19:38 compute-0 NetworkManager[54954]: <info>  [1769120378.1988] manager: (tap698e77c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.217 182729 INFO nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Post operation of migration started
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.239 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9056179a-fc19-4536-b50e-eb441356001c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.244 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b2627af0-973c-478c-ba04-8dead352cacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.274 182729 DEBUG nova.network.neutron [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:38 compute-0 NetworkManager[54954]: <info>  [1769120378.2891] device (tap698e77c5-f0): carrier: link connected
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.295 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a803acf3-4237-4b91-aa9f-7033aae0eebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.301 182729 DEBUG nova.network.neutron [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.318 182729 INFO nova.compute.manager [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Took 0.18 seconds to deallocate network for instance.
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.323 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7ab6f1-e232-4505-8361-f946442977f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401589, 'reachable_time': 22080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214479, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.335 182729 INFO nova.virt.libvirt.driver [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance destroyed successfully.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.335 182729 DEBUG nova.objects.instance [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'resources' on Instance uuid be44208f-27c9-4da7-a5bc-5c2583fdb393 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.341 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7bfdd5-02f1-470e-9dbb-f33c486b59b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:3733'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401589, 'tstamp': 401589}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214484, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.361 182729 INFO nova.virt.libvirt.driver [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Deleting instance files /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393_del
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.361 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b45a76e6-4a6c-42ab-aa46-196ede3b4fbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401589, 'reachable_time': 22080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214485, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.362 182729 INFO nova.virt.libvirt.driver [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Deletion of /var/lib/nova/instances/be44208f-27c9-4da7-a5bc-5c2583fdb393_del complete
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.403 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[157ca00f-db29-450e-a3f2-d72301b1fd9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.447 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.448 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.462 182729 INFO nova.compute.manager [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.463 182729 DEBUG oslo.service.loopingcall [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.463 182729 DEBUG nova.compute.manager [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.463 182729 DEBUG nova.network.neutron [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.494 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[856df09a-427a-470f-b047-30cb2451b4b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.496 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.497 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.497 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.500 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:38 compute-0 kernel: tap698e77c5-f0: entered promiscuous mode
Jan 22 22:19:38 compute-0 NetworkManager[54954]: <info>  [1769120378.5008] manager: (tap698e77c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.505 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:38 compute-0 ovn_controller[94850]: 2026-01-22T22:19:38Z|00088|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.508 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.525 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.527 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.528 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c5d051-929d-4729-a419-1bcff53717a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.529 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:19:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:38.531 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'env', 'PROCESS_TAG=haproxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/698e77c5-fce6-47a5-b6e3-f4c56da226ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.541 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.571 182729 DEBUG nova.compute.provider_tree [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.587 182729 DEBUG nova.scheduler.client.report [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.614 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.647 182729 INFO nova.scheduler.client.report [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Deleted allocations for instance 1c330292-7fe1-4a26-a2d0-85ee27c734f0
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.653 182729 DEBUG nova.network.neutron [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.663 182729 DEBUG nova.network.neutron [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.684 182729 INFO nova.compute.manager [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Took 0.22 seconds to deallocate network for instance.
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.689 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.689 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.689 182729 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.718 182729 DEBUG oslo_concurrency.lockutils [None req-666430cf-a66b-4637-992e-1ccbada7cfc3 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "1c330292-7fe1-4a26-a2d0-85ee27c734f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.760 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.761 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.857 182729 DEBUG nova.compute.provider_tree [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.872 182729 DEBUG nova.scheduler.client.report [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.897 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:38 compute-0 nova_compute[182725]: 2026-01-22 22:19:38.951 182729 INFO nova.scheduler.client.report [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Deleted allocations for instance be44208f-27c9-4da7-a5bc-5c2583fdb393
Jan 22 22:19:38 compute-0 podman[214516]: 2026-01-22 22:19:38.9806285 +0000 UTC m=+0.068743181 container create 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:19:39 compute-0 systemd[1]: Started libpod-conmon-5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8.scope.
Jan 22 22:19:39 compute-0 nova_compute[182725]: 2026-01-22 22:19:39.036 182729 DEBUG oslo_concurrency.lockutils [None req-d36a32da-be75-4055-ad83-937fe68b9787 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "be44208f-27c9-4da7-a5bc-5c2583fdb393" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:39 compute-0 podman[214516]: 2026-01-22 22:19:38.945336162 +0000 UTC m=+0.033450863 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:19:39 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:19:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/859ec6cf10c9e8a8d79cb1286fe9a9ec3730c8f7d29da7c1e4ddcbdfb4090dd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:19:39 compute-0 podman[214516]: 2026-01-22 22:19:39.071680142 +0000 UTC m=+0.159794853 container init 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:19:39 compute-0 podman[214516]: 2026-01-22 22:19:39.078836082 +0000 UTC m=+0.166950773 container start 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:19:39 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [NOTICE]   (214535) : New worker (214537) forked
Jan 22 22:19:39 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [NOTICE]   (214535) : Loading success.
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.363 182729 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.427 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.523 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.524 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.524 182729 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:40 compute-0 nova_compute[182725]: 2026-01-22 22:19:40.530 182729 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 22 22:19:40 compute-0 virtqemud[182297]: Domain id=12 name='instance-00000016' uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed is tainted: custom-monitor
Jan 22 22:19:41 compute-0 podman[214546]: 2026-01-22 22:19:41.129022567 +0000 UTC m=+0.063820148 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:19:41 compute-0 nova_compute[182725]: 2026-01-22 22:19:41.544 182729 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 22 22:19:42 compute-0 nova_compute[182725]: 2026-01-22 22:19:42.371 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:42 compute-0 nova_compute[182725]: 2026-01-22 22:19:42.550 182729 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 22 22:19:42 compute-0 nova_compute[182725]: 2026-01-22 22:19:42.556 182729 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:42 compute-0 nova_compute[182725]: 2026-01-22 22:19:42.577 182729 DEBUG nova.objects.instance [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 22:19:43 compute-0 nova_compute[182725]: 2026-01-22 22:19:43.545 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:47 compute-0 nova_compute[182725]: 2026-01-22 22:19:47.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:48 compute-0 nova_compute[182725]: 2026-01-22 22:19:48.547 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:49 compute-0 nova_compute[182725]: 2026-01-22 22:19:49.306 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120374.3050287, c20e5ebc-adf4-4b83-af3a-908b9b574a25 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:49 compute-0 nova_compute[182725]: 2026-01-22 22:19:49.306 182729 INFO nova.compute.manager [-] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] VM Stopped (Lifecycle Event)
Jan 22 22:19:49 compute-0 nova_compute[182725]: 2026-01-22 22:19:49.327 182729 DEBUG nova.compute.manager [None req-57c7d0fe-caa5-480f-862a-518ff790e091 - - - - - -] [instance: c20e5ebc-adf4-4b83-af3a-908b9b574a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.225 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.225 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.226 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.226 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.226 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.241 182729 INFO nova.compute.manager [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Terminating instance
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.251 182729 DEBUG nova.compute.manager [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:19:51 compute-0 kernel: tap09a74418-97 (unregistering): left promiscuous mode
Jan 22 22:19:51 compute-0 NetworkManager[54954]: <info>  [1769120391.2789] device (tap09a74418-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:19:51 compute-0 ovn_controller[94850]: 2026-01-22T22:19:51Z|00089|binding|INFO|Releasing lport 09a74418-977b-4aa6-86a6-2d84a3cb143a from this chassis (sb_readonly=0)
Jan 22 22:19:51 compute-0 ovn_controller[94850]: 2026-01-22T22:19:51Z|00090|binding|INFO|Setting lport 09a74418-977b-4aa6-86a6-2d84a3cb143a down in Southbound
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.282 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 ovn_controller[94850]: 2026-01-22T22:19:51Z|00091|binding|INFO|Removing iface tap09a74418-97 ovn-installed in OVS
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.291 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:3d:22 10.100.0.5'], port_security=['fa:16:3e:59:3d:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '008af030-d785-4936-871a-4d52ccebc8f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=09a74418-977b-4aa6-86a6-2d84a3cb143a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.294 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 09a74418-977b-4aa6-86a6-2d84a3cb143a in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c unbound from our chassis
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.297 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.299 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6deea5-730e-4aae-af2e-5d47156cea5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.301 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c namespace which is not needed anymore
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.310 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 22 22:19:51 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Consumed 15.467s CPU time.
Jan 22 22:19:51 compute-0 systemd-machined[154006]: Machine qemu-7-instance-00000013 terminated.
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [NOTICE]   (213623) : haproxy version is 2.8.14-c23fe91
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [NOTICE]   (213623) : path to executable is /usr/sbin/haproxy
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [WARNING]  (213623) : Exiting Master process...
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [WARNING]  (213623) : Exiting Master process...
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [ALERT]    (213623) : Current worker (213625) exited with code 143 (Terminated)
Jan 22 22:19:51 compute-0 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213619]: [WARNING]  (213623) : All workers exited. Exiting... (0)
Jan 22 22:19:51 compute-0 systemd[1]: libpod-53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04.scope: Deactivated successfully.
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.477 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 podman[214597]: 2026-01-22 22:19:51.478676051 +0000 UTC m=+0.056688518 container died 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.489 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04-userdata-shm.mount: Deactivated successfully.
Jan 22 22:19:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc8012d210d5001d0b8e47f8cfa4ccf4fd6427e37b92900a149199b63c337505-merged.mount: Deactivated successfully.
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.541 182729 INFO nova.virt.libvirt.driver [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Instance destroyed successfully.
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.542 182729 DEBUG nova.objects.instance [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'resources' on Instance uuid 008af030-d785-4936-871a-4d52ccebc8f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:51 compute-0 podman[214597]: 2026-01-22 22:19:51.543923263 +0000 UTC m=+0.121935670 container cleanup 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.561 182729 DEBUG nova.virt.libvirt.vif [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1904910074',display_name='tempest-ServersAdminTestJSON-server-1904910074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1904910074',id=19,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-tyub60qk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:18:40Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=008af030-d785-4936-871a-4d52ccebc8f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.562 182729 DEBUG nova.network.os_vif_util [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "address": "fa:16:3e:59:3d:22", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a74418-97", "ovs_interfaceid": "09a74418-977b-4aa6-86a6-2d84a3cb143a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.563 182729 DEBUG nova.network.os_vif_util [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.563 182729 DEBUG os_vif [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.566 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.566 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09a74418-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.568 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 systemd[1]: libpod-conmon-53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04.scope: Deactivated successfully.
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.573 182729 INFO os_vif [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:3d:22,bridge_name='br-int',has_traffic_filtering=True,id=09a74418-977b-4aa6-86a6-2d84a3cb143a,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a74418-97')
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.574 182729 INFO nova.virt.libvirt.driver [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Deleting instance files /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8_del
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.575 182729 INFO nova.virt.libvirt.driver [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Deletion of /var/lib/nova/instances/008af030-d785-4936-871a-4d52ccebc8f8_del complete
Jan 22 22:19:51 compute-0 sshd-session[214570]: Invalid user admin from 45.148.10.121 port 35520
Jan 22 22:19:51 compute-0 podman[214638]: 2026-01-22 22:19:51.629407314 +0000 UTC m=+0.052293517 container remove 53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.639 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e273fbf9-c81d-4dd3-8847-45d71701ac3e]: (4, ('Thu Jan 22 10:19:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c (53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04)\n53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04\nThu Jan 22 10:19:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c (53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04)\n53a30c37461e017a530ca2375c38b083f11d29ec22e32489252392e2b7004b04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.642 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[953ac56b-57dc-4653-b014-4e2e23f2510c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.644 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:51 compute-0 kernel: tap19dd816f-60: left promiscuous mode
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.647 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.666 182729 DEBUG nova.compute.manager [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-unplugged-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:51 compute-0 sshd-session[214570]: Connection closed by invalid user admin 45.148.10.121 port 35520 [preauth]
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.667 182729 DEBUG oslo_concurrency.lockutils [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.667 182729 DEBUG oslo_concurrency.lockutils [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.668 182729 DEBUG oslo_concurrency.lockutils [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.668 182729 DEBUG nova.compute.manager [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] No waiting events found dispatching network-vif-unplugged-09a74418-977b-4aa6-86a6-2d84a3cb143a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.669 182729 DEBUG nova.compute.manager [req-a048f83f-514e-425f-bafb-e5fc8ee99483 req-87112dc8-8a56-4c33-94b8-0f65a4fbb069 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-unplugged-09a74418-977b-4aa6-86a6-2d84a3cb143a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.671 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.678 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[be1cf9a3-1c49-420b-bec8-b6e972b38039]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.680 182729 INFO nova.compute.manager [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.680 182729 DEBUG oslo.service.loopingcall [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.681 182729 DEBUG nova.compute.manager [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:19:51 compute-0 nova_compute[182725]: 2026-01-22 22:19:51.681 182729 DEBUG nova.network.neutron [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.702 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[38961e76-fff5-41ca-a9f3-e1128680e0b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.704 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f61180-c7dd-4bfc-b65c-5a9ff8a97496]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.733 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b972b33f-d9ef-47b9-950a-00432ea251e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395754, 'reachable_time': 30839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214658, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d19dd816f\x2d669a\x2d4bda\x2db508\x2da3ddcd4c2d7c.mount: Deactivated successfully.
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.740 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:19:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:51.741 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[c8037a17-994f-47d6-9c6d-c11b86449f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.340 182729 DEBUG nova.network.neutron [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.358 182729 INFO nova.compute.manager [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Took 0.68 seconds to deallocate network for instance.
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.436 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.437 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.514 182729 DEBUG nova.compute.provider_tree [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.529 182729 DEBUG nova.scheduler.client.report [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.549 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.571 182729 INFO nova.scheduler.client.report [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Deleted allocations for instance 008af030-d785-4936-871a-4d52ccebc8f8
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.658 182729 DEBUG oslo_concurrency.lockutils [None req-3c75e5c1-548c-4dae-b5d8-e13aa6c07c50 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.663 182729 DEBUG nova.compute.manager [req-1fd0888a-65d4-42e5-a743-bf494d604204 req-60549b98-68c5-4545-a378-8b8fefe1b55f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-deleted-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.846 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.872 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Triggering sync for uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.873 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.874 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:52 compute-0 nova_compute[182725]: 2026-01-22 22:19:52.897 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.038 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120378.0361993, 1c330292-7fe1-4a26-a2d0-85ee27c734f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.038 182729 INFO nova.compute.manager [-] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] VM Stopped (Lifecycle Event)
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.059 182729 DEBUG nova.compute.manager [None req-f041b5db-920d-4134-a007-31b2545870bf - - - - - -] [instance: 1c330292-7fe1-4a26-a2d0-85ee27c734f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.120 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.121 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.138 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.233 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.233 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.242 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.242 182729 INFO nova.compute.claims [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.330 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120378.329534, be44208f-27c9-4da7-a5bc-5c2583fdb393 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.331 182729 INFO nova.compute.manager [-] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] VM Stopped (Lifecycle Event)
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.365 182729 DEBUG nova.compute.manager [None req-c82da704-dae4-48a1-8f07-5431ea262ea2 - - - - - -] [instance: be44208f-27c9-4da7-a5bc-5c2583fdb393] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.443 182729 DEBUG nova.compute.provider_tree [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.472 182729 DEBUG nova.scheduler.client.report [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.500 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.501 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.552 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.554 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.554 182729 DEBUG nova.network.neutron [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.569 182729 INFO nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.583 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.691 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.693 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.693 182729 INFO nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating image(s)
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.694 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.694 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.695 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.711 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.793 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.795 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.796 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.827 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.863 182729 DEBUG nova.compute.manager [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.864 182729 DEBUG oslo_concurrency.lockutils [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "008af030-d785-4936-871a-4d52ccebc8f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.864 182729 DEBUG oslo_concurrency.lockutils [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.864 182729 DEBUG oslo_concurrency.lockutils [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "008af030-d785-4936-871a-4d52ccebc8f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.864 182729 DEBUG nova.compute.manager [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] No waiting events found dispatching network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.864 182729 WARNING nova.compute.manager [req-31613f51-d745-4709-829a-7c77695c348a req-3eebf304-8b1c-46ee-91d9-992d646fc276 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Received unexpected event network-vif-plugged-09a74418-977b-4aa6-86a6-2d84a3cb143a for instance with vm_state deleted and task_state None.
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.914 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:53 compute-0 nova_compute[182725]: 2026-01-22 22:19:53.915 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.112 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk 1073741824" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.115 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.115 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.148 182729 DEBUG nova.policy [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.192 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.193 182729 DEBUG nova.virt.disk.api [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Checking if we can resize image /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.194 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:54 compute-0 podman[214668]: 2026-01-22 22:19:54.197019479 +0000 UTC m=+0.121309444 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.257 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.259 182729 DEBUG nova.virt.disk.api [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Cannot resize image /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.259 182729 DEBUG nova.objects.instance [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'migration_context' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.288 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.289 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Ensure instance console log exists: /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.289 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.289 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:54 compute-0 nova_compute[182725]: 2026-01-22 22:19:54.289 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:55 compute-0 nova_compute[182725]: 2026-01-22 22:19:55.863 182729 DEBUG nova.network.neutron [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Successfully updated port: 7e88b712-bef4-4434-b405-04af2a2d3d0f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:19:55 compute-0 nova_compute[182725]: 2026-01-22 22:19:55.878 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:55 compute-0 nova_compute[182725]: 2026-01-22 22:19:55.878 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquired lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:55 compute-0 nova_compute[182725]: 2026-01-22 22:19:55.879 182729 DEBUG nova.network.neutron [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:19:56 compute-0 nova_compute[182725]: 2026-01-22 22:19:56.065 182729 DEBUG nova.network.neutron [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:19:56 compute-0 nova_compute[182725]: 2026-01-22 22:19:56.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:56 compute-0 nova_compute[182725]: 2026-01-22 22:19:56.891 182729 DEBUG nova.compute.manager [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-changed-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:56 compute-0 nova_compute[182725]: 2026-01-22 22:19:56.892 182729 DEBUG nova.compute.manager [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Refreshing instance network info cache due to event network-changed-7e88b712-bef4-4434-b405-04af2a2d3d0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:19:56 compute-0 nova_compute[182725]: 2026-01-22 22:19:56.892 182729 DEBUG oslo_concurrency.lockutils [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.394 182729 DEBUG nova.network.neutron [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.415 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Releasing lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.416 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance network_info: |[{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.418 182729 DEBUG oslo_concurrency.lockutils [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.418 182729 DEBUG nova.network.neutron [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Refreshing network info cache for port 7e88b712-bef4-4434-b405-04af2a2d3d0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.421 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Start _get_guest_xml network_info=[{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.427 182729 WARNING nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.435 182729 DEBUG nova.virt.libvirt.host [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.435 182729 DEBUG nova.virt.libvirt.host [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.438 182729 DEBUG nova.virt.libvirt.host [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.439 182729 DEBUG nova.virt.libvirt.host [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.440 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.440 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.441 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.441 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.441 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.441 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.442 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.442 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.442 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.442 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.443 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.443 182729 DEBUG nova.virt.hardware [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.447 182729 DEBUG nova.virt.libvirt.vif [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:53Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.447 182729 DEBUG nova.network.os_vif_util [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.448 182729 DEBUG nova.network.os_vif_util [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.449 182729 DEBUG nova.objects.instance [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'pci_devices' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.464 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <uuid>b484b5f7-0814-4161-b492-633788f2961f</uuid>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <name>instance-0000001e</name>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:name>tempest-LiveMigrationTest-server-1962651622</nova:name>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:19:57</nova:creationTime>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:user uuid="06b4b3807dc64d83b8bfbbf0c4d31d77">tempest-LiveMigrationTest-652633664-project-member</nova:user>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:project uuid="9ead4241c55147dcbe136a6d6a69a60f">tempest-LiveMigrationTest-652633664</nova:project>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         <nova:port uuid="7e88b712-bef4-4434-b405-04af2a2d3d0f">
Jan 22 22:19:57 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <system>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="serial">b484b5f7-0814-4161-b492-633788f2961f</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="uuid">b484b5f7-0814-4161-b492-633788f2961f</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </system>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <os>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </os>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <features>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </features>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:ad:07:e9"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <target dev="tap7e88b712-be"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/console.log" append="off"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <video>
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </video>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:19:57 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:19:57 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:19:57 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:19:57 compute-0 nova_compute[182725]: </domain>
Jan 22 22:19:57 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.466 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Preparing to wait for external event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.466 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.467 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.467 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.469 182729 DEBUG nova.virt.libvirt.vif [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:53Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.469 182729 DEBUG nova.network.os_vif_util [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.470 182729 DEBUG nova.network.os_vif_util [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.470 182729 DEBUG os_vif [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.471 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.472 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.472 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.477 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.478 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e88b712-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.478 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e88b712-be, col_values=(('external_ids', {'iface-id': '7e88b712-bef4-4434-b405-04af2a2d3d0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:07:e9', 'vm-uuid': 'b484b5f7-0814-4161-b492-633788f2961f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:57 compute-0 NetworkManager[54954]: <info>  [1769120397.4819] manager: (tap7e88b712-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.482 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.483 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.489 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.489 182729 INFO os_vif [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be')
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.559 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.560 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.560 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] No VIF found with MAC fa:16:3e:ad:07:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:19:57 compute-0 nova_compute[182725]: 2026-01-22 22:19:57.561 182729 INFO nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Using config drive
Jan 22 22:19:57 compute-0 podman[214697]: 2026-01-22 22:19:57.62033051 +0000 UTC m=+0.090232922 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:19:57 compute-0 podman[214698]: 2026-01-22 22:19:57.622474234 +0000 UTC m=+0.093244848 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.041 182729 INFO nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating config drive at /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.050 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcdr34dq2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.179 182729 DEBUG oslo_concurrency.processutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcdr34dq2" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:19:58 compute-0 kernel: tap7e88b712-be: entered promiscuous mode
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.2606] manager: (tap7e88b712-be): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00092|binding|INFO|Claiming lport 7e88b712-bef4-4434-b405-04af2a2d3d0f for this chassis.
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00093|binding|INFO|7e88b712-bef4-4434-b405-04af2a2d3d0f: Claiming fa:16:3e:ad:07:e9 10.100.0.14
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00094|binding|INFO|Claiming lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 for this chassis.
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00095|binding|INFO|a87b1642-1ac3-4b35-809d-79c74a2f4e13: Claiming fa:16:3e:ac:a8:b5 19.80.0.59
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.265 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.289 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:07:e9 10.100.0.14'], port_security=['fa:16:3e:ad:07:e9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-850980191', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-850980191', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=7e88b712-bef4-4434-b405-04af2a2d3d0f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.291 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:a8:b5 19.80.0.59'], port_security=['fa:16:3e:ac:a8:b5 19.80.0.59'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['7e88b712-bef4-4434-b405-04af2a2d3d0f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1993138940', 'neutron:cidrs': '19.80.0.59/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1993138940', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=529a2a70-69a8-4f19-a951-f1c58852ecd0, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a87b1642-1ac3-4b35-809d-79c74a2f4e13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:19:58 compute-0 systemd-udevd[214757]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.292 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 7e88b712-bef4-4434-b405-04af2a2d3d0f in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea bound to our chassis
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.294 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.306 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.3122] device (tap7e88b712-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00096|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f ovn-installed in OVS
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00097|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f up in Southbound
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00098|binding|INFO|Setting lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 up in Southbound
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.312 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.3142] device (tap7e88b712-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.315 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[334672d6-18f2-4078-8754-510752345653]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 systemd-machined[154006]: New machine qemu-13-instance-0000001e.
Jan 22 22:19:58 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000001e.
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.352 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a50a1bf1-330c-452e-a24b-c5810f53755a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.356 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6716ae7d-36ec-4e24-8c9a-200c370c6771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.392 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b7d80c-f2b0-4455-8173-66bb2f995263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.413 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e8b370-be00-4497-99e7-341d8b451a8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401589, 'reachable_time': 22080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214773, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.439 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42c60aed-d8a3-4b5d-8461-0e5bfd52896d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap698e77c5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401604, 'tstamp': 401604}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214775, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap698e77c5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401609, 'tstamp': 401609}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214775, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.441 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.444 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.446 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.447 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.447 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.448 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.450 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a87b1642-1ac3-4b35-809d-79c74a2f4e13 in datapath 75073b6a-f711-4d82-9e11-07cd8a1d16e2 unbound from our chassis
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.452 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75073b6a-f711-4d82-9e11-07cd8a1d16e2
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.468 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5fcc68d5-5e91-4d14-b35f-14a5d8c81c18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.469 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75073b6a-f1 in ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.471 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75073b6a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.471 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5123cfed-fadf-4259-a513-1277648cf3c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.472 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e38e14-3d76-4e40-b91b-eb8b61d04a75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.496 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1df3eef8-54be-4e38-861a-53847a05a854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.523 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7288b527-96f6-4212-a8b6-abd53a0dca14]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.553 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[71f9cee9-a177-4978-8119-287401604ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.555 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 systemd-udevd[214763]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.5666] manager: (tap75073b6a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00099|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.568 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6018d89d-8037-440f-9b1e-4344309f4536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.610 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b3383312-cb3f-4e1f-bdc3-706bc6305332]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.614 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[414c4bc8-4fb7-446a-852b-70ad57859b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.621 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.6361] device (tap75073b6a-f0): carrier: link connected
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.640 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[60db238e-0a7c-49b0-9d09-06cf13df794b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.661 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[772747bb-05ed-4b40-8e74-b7621704614d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75073b6a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:2f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403623, 'reachable_time': 16007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214802, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.677 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb9c723-6cc8-44c2-adb7-de021248566d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:2fb5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403623, 'tstamp': 403623}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214807, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.693 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd6f76c-0776-4977-a4a9-6e798c5e93a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75073b6a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:2f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403623, 'reachable_time': 16007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214808, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.724 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fbcc0820-c5a1-4035-a116-fff4d3225e7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.762 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120398.7611487, b484b5f7-0814-4161-b492-633788f2961f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.762 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Started (Lifecycle Event)
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.784 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.788 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120398.7644014, b484b5f7-0814-4161-b492-633788f2961f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.789 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Paused (Lifecycle Event)
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.796 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[99038b8c-5047-42da-9e0f-241162acb857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.798 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75073b6a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.798 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.799 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75073b6a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.800 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 NetworkManager[54954]: <info>  [1769120398.8017] manager: (tap75073b6a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 22 22:19:58 compute-0 kernel: tap75073b6a-f0: entered promiscuous mode
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.805 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.805 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75073b6a-f0, col_values=(('external_ids', {'iface-id': '65e3ee7d-2176-49c3-aeee-08b035ff0bbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.806 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 ovn_controller[94850]: 2026-01-22T22:19:58Z|00100|binding|INFO|Releasing lport 65e3ee7d-2176-49c3-aeee-08b035ff0bbf from this chassis (sb_readonly=0)
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.813 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.818 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.819 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.820 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef7c2bc-3162-4bf5-b80c-d6dc9803a551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.821 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-75073b6a-f711-4d82-9e11-07cd8a1d16e2
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 75073b6a-f711-4d82-9e11-07cd8a1d16e2
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:19:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:19:58.822 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'env', 'PROCESS_TAG=haproxy-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75073b6a-f711-4d82-9e11-07cd8a1d16e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.831 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.892 182729 DEBUG nova.network.neutron [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updated VIF entry in instance network info cache for port 7e88b712-bef4-4434-b405-04af2a2d3d0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.893 182729 DEBUG nova.network.neutron [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:19:58 compute-0 nova_compute[182725]: 2026-01-22 22:19:58.911 182729 DEBUG oslo_concurrency.lockutils [req-3aff642b-c37f-4a4c-b913-2e3558e59e86 req-af565b11-f650-4168-a45b-ee7eb24a7bb9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.011 182729 DEBUG nova.compute.manager [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.012 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.012 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.013 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.013 182729 DEBUG nova.compute.manager [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Processing event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.014 182729 DEBUG nova.compute.manager [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.014 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.014 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.015 182729 DEBUG oslo_concurrency.lockutils [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.015 182729 DEBUG nova.compute.manager [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.015 182729 WARNING nova.compute.manager [req-732904ac-d243-48ad-9f35-485481f42c38 req-ea6b56bf-0951-4e0b-a833-aef28c38b8b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state building and task_state spawning.
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.016 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.021 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120399.020715, b484b5f7-0814-4161-b492-633788f2961f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.021 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Resumed (Lifecycle Event)
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.041 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.042 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.046 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.048 182729 INFO nova.virt.libvirt.driver [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance spawned successfully.
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.049 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.068 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.084 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.085 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.086 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.086 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.087 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.088 182729 DEBUG nova.virt.libvirt.driver [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.162 182729 INFO nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Took 5.47 seconds to spawn the instance on the hypervisor.
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.163 182729 DEBUG nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.264 182729 INFO nova.compute.manager [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Took 6.06 seconds to build instance.
Jan 22 22:19:59 compute-0 podman[214841]: 2026-01-22 22:19:59.282627103 +0000 UTC m=+0.075737087 container create 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:19:59 compute-0 nova_compute[182725]: 2026-01-22 22:19:59.290 182729 DEBUG oslo_concurrency.lockutils [None req-a477c3e0-5743-4c6c-8b3b-b7c8b97d4419 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:19:59 compute-0 systemd[1]: Started libpod-conmon-90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625.scope.
Jan 22 22:19:59 compute-0 podman[214841]: 2026-01-22 22:19:59.236490092 +0000 UTC m=+0.029600106 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:19:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:19:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82f4eaa72c1cdb5c949454a9e3954e231d0403b18da579c66c78a9f7314213da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:19:59 compute-0 podman[214841]: 2026-01-22 22:19:59.380525546 +0000 UTC m=+0.173635530 container init 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:19:59 compute-0 podman[214841]: 2026-01-22 22:19:59.385923882 +0000 UTC m=+0.179033846 container start 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:19:59 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [NOTICE]   (214860) : New worker (214862) forked
Jan 22 22:19:59 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [NOTICE]   (214860) : Loading success.
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.337 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Check if temp file /var/lib/nova/instances/tmp73gi6c2o exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.341 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.425 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.427 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.482 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.522 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:02 compute-0 nova_compute[182725]: 2026-01-22 22:20:02.523 182729 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 22 22:20:03 compute-0 nova_compute[182725]: 2026-01-22 22:20:03.555 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:03 compute-0 nova_compute[182725]: 2026-01-22 22:20:03.735 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:03 compute-0 nova_compute[182725]: 2026-01-22 22:20:03.811 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:03 compute-0 nova_compute[182725]: 2026-01-22 22:20:03.812 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:03 compute-0 nova_compute[182725]: 2026-01-22 22:20:03.868 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:06 compute-0 sshd-session[214886]: Accepted publickey for nova from 192.168.122.102 port 47266 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:20:06 compute-0 systemd-logind[801]: New session 32 of user nova.
Jan 22 22:20:06 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:20:06 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:20:06 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:20:06 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:20:06 compute-0 systemd[214890]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.541 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120391.5378888, 008af030-d785-4936-871a-4d52ccebc8f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.541 182729 INFO nova.compute.manager [-] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] VM Stopped (Lifecycle Event)
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.561 182729 DEBUG nova.compute.manager [None req-661f97b1-514b-40c0-b310-11a342919920 - - - - - -] [instance: 008af030-d785-4936-871a-4d52ccebc8f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:06 compute-0 systemd[214890]: Queued start job for default target Main User Target.
Jan 22 22:20:06 compute-0 systemd[214890]: Created slice User Application Slice.
Jan 22 22:20:06 compute-0 systemd[214890]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:20:06 compute-0 systemd[214890]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:20:06 compute-0 systemd[214890]: Reached target Paths.
Jan 22 22:20:06 compute-0 systemd[214890]: Reached target Timers.
Jan 22 22:20:06 compute-0 systemd[214890]: Starting D-Bus User Message Bus Socket...
Jan 22 22:20:06 compute-0 systemd[214890]: Starting Create User's Volatile Files and Directories...
Jan 22 22:20:06 compute-0 systemd[214890]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:20:06 compute-0 systemd[214890]: Reached target Sockets.
Jan 22 22:20:06 compute-0 systemd[214890]: Finished Create User's Volatile Files and Directories.
Jan 22 22:20:06 compute-0 systemd[214890]: Reached target Basic System.
Jan 22 22:20:06 compute-0 systemd[214890]: Reached target Main User Target.
Jan 22 22:20:06 compute-0 systemd[214890]: Startup finished in 156ms.
Jan 22 22:20:06 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:20:06 compute-0 systemd[1]: Started Session 32 of User nova.
Jan 22 22:20:06 compute-0 sshd-session[214886]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:20:06 compute-0 sshd-session[214905]: Received disconnect from 192.168.122.102 port 47266:11: disconnected by user
Jan 22 22:20:06 compute-0 sshd-session[214905]: Disconnected from user nova 192.168.122.102 port 47266
Jan 22 22:20:06 compute-0 sshd-session[214886]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:20:06 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Jan 22 22:20:06 compute-0 systemd-logind[801]: Session 32 logged out. Waiting for processes to exit.
Jan 22 22:20:06 compute-0 systemd-logind[801]: Removed session 32.
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.911 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.912 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.913 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:20:06 compute-0 nova_compute[182725]: 2026-01-22 22:20:06.913 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:20:07 compute-0 nova_compute[182725]: 2026-01-22 22:20:07.226 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:07 compute-0 nova_compute[182725]: 2026-01-22 22:20:07.226 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:07 compute-0 nova_compute[182725]: 2026-01-22 22:20:07.227 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:20:07 compute-0 nova_compute[182725]: 2026-01-22 22:20:07.227 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:07 compute-0 nova_compute[182725]: 2026-01-22 22:20:07.487 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.135 182729 DEBUG nova.compute.manager [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.137 182729 DEBUG oslo_concurrency.lockutils [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.138 182729 DEBUG oslo_concurrency.lockutils [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.138 182729 DEBUG oslo_concurrency.lockutils [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.138 182729 DEBUG nova.compute.manager [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.139 182729 DEBUG nova.compute.manager [req-3a083272-0d27-42f8-9726-be32e2d6babb req-6353b798-8d06-44fa-915a-d21a70001bb7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:20:08 compute-0 podman[214907]: 2026-01-22 22:20:08.180223115 +0000 UTC m=+0.100748137 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:20:08 compute-0 podman[214908]: 2026-01-22 22:20:08.182123093 +0000 UTC m=+0.102563892 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.558 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.656 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.684 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.685 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.686 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.693 182729 INFO nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Took 4.82 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.694 182729 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.715 182729 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(71b7dbf7-048d-47da-b5e0-c5906bcb8587),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.766 182729 DEBUG nova.objects.instance [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.767 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.769 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.770 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.790 182729 DEBUG nova.virt.libvirt.vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:59Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.790 182729 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.791 182729 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.792 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 22:20:08 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:ad:07:e9"/>
Jan 22 22:20:08 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:20:08 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:20:08 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:20:08 compute-0 nova_compute[182725]:   <target dev="tap7e88b712-be"/>
Jan 22 22:20:08 compute-0 nova_compute[182725]: </interface>
Jan 22 22:20:08 compute-0 nova_compute[182725]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.793 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.912 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.912 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:08 compute-0 nova_compute[182725]: 2026-01-22 22:20:08.913 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.027 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.099 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.101 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.110 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b484b5f7-0814-4161-b492-633788f2961f', 'name': 'tempest-LiveMigrationTest-server-1962651622', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9ead4241c55147dcbe136a6d6a69a60f', 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'hostId': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.116 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'name': 'tempest-LiveMigrationTest-server-55126447', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000016', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9ead4241c55147dcbe136a6d6a69a60f', 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'hostId': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.149 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.151 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.159 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.170 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.194 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.195 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b30c6cb-016d-4eb1-9783-898fbc07c7d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.117149', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83073818-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '009bd54076657384984adb3a743a07b79f63ec291706f2a885b372de238da876'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 
'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.117149', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8307501e-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': 'e01dd3fc744df3029f2cd75a506672df27abe880cf67969c904f7b37302b3397'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.117149', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '830df98c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '428db6374a3f4ead0433b68fb921a0c9803e7407f34d996b87547c558c78fb91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.117149', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '830e1d22-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '508b409dc2e7b69bdc2c18daf05c193f840ccbabcec5e47d5b4f0c9d382f2286'}]}, 'timestamp': '2026-01-22 22:20:09.196417', '_unique_id': 'f9dea2ec1f9f4deea535da83ccb1d2c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.222 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.224 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.243 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.allocation volume: 30617600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.245 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d025322-6efd-4bbe-ab7c-ff5b63064e74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.205409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83125eaa-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': '669b8d38fe516708bda7258768cf4cb85a745e3a8b0aa583cf5a77b65005540e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 
'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.205409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8312865a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': '9dca2765b0798be6e8a0b6b4a90aad20901129bfe66ddb66ef4ccea93b40d9f2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30617600, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.205409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '831586fc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': '013fa0ba7a906ea05a8d4c027f47190402b83ba29ee5106e5b3801fff721aa0e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.205409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8315a894-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': '1f4f3cc8d1442745aafc5f399c5727de745d2b06e06d3b77f6c5f1a4b37f13c5'}]}, 'timestamp': '2026-01-22 22:20:09.245819', '_unique_id': '6dd02111d66f485ebe06ac5a01e0926e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.253 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.254 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.255 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.latency volume: 4702580 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.255 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d80e9e8-fad1-448c-a821-efc69d4c8531', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.253579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8317007c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '5e77ac3466f7dea5f87075d8e35e44203c032ee0ff2caba72fdcee33023d0e5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 
'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.253579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83171d46-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': 'd5450d7871ea52920f3eeb3dfed193eb0963f95c1a7f2e63ae3f1cc6fccd7c82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4702580, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.253579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83173646-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '9e44f7a3eb8c7040830ee2a81f259bb863f8735219454c5fb7380b92c58dcc90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.253579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83174ffa-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '38e13b7c41f7df5c3299a014d1b77dc7001a09ca2619a4dd42bc7fb7374ae2c9'}]}, 'timestamp': '2026-01-22 22:20:09.256573', '_unique_id': 'ea950c1345314f1d8ab039bffc18a79e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.260 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.262 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.268 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b484b5f7-0814-4161-b492-633788f2961f / tap7e88b712-be inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.269 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.273 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 469eaf2b-7d53-40c9-a233-b27d702a21ed / tap580dc508-63 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.273 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'facf503a-7f17-4e9b-b470-b2e99097d2d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.264086', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '83196380-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': '94fdc6eb39e491c26fa0dffafe32de7c5ae73062f6e1258e5c3284bf71f5ce11'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.264086', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '831a1118-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': 'bb7c187f32001a53dd2ccc490fc3a72afde45bd4fd57d931ae6281538657c66d'}]}, 'timestamp': '2026-01-22 22:20:09.274693', '_unique_id': '932d3be5043a4b15ba55247488cb18f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.282 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.283 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.284 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.284 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.285 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.286 182729 INFO nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64cd7f17-8aca-4993-8c5e-817c441de150', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.282310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '831b6298-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '31bdbd8ab8f87739676133ac6dd31f9563d68350d7370e1f1eac595be5899cc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 
'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.282310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '831b7da0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '160bd48323142d0afb0239a1ca6099c4ec86c1b2d927a1165268b8a4d64b7805'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.282310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '831b9a88-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': 'ce4982dbd82a48ac3f09cf726a7f5b7a36b53b3329e106d911f71c8c5e1e4130'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.282310', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '831bb4dc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '580194c92581cbda5b79ea1980d5853d26b7bd25f62631e17e58a6c7a991e88f'}]}, 'timestamp': '2026-01-22 22:20:09.285423', '_unique_id': '8d92d2eb0ee148c7ae6958df0d0a9573'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.288 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.289 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dd57cd8-9f19-484b-bccd-06ce9835f570', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.288728', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '831c4d66-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': 'dd95603739a7a6c468fc667c1d61f95fc613fdbb8d104f25ebfca7378f2cd319'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.288728', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '831c662a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': 'c3bb9fe52dd5faa803bedc64414a2a0c7f2fa1f8d5fd21477e8e245e9ee1360f'}]}, 'timestamp': '2026-01-22 22:20:09.289978', '_unique_id': 'f3ddc71a50224f48be088130e1845bcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.291 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.292 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.292 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ec9ded4-c0ba-49f5-8824-b9d9e987f01b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.292081', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '831cca3e-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': 'bdd6b4438919de6c6db5adfaa5682ba3bb8557c432805724eecb1d13c2e6dda9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': 
'06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.292081', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '831cd51a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '4d2f5582aac6cad3e9199373397739380cbe733e6ac8694ebaf27c0886508a30'}]}, 'timestamp': '2026-01-22 22:20:09.292656', '_unique_id': '8715f03e96bf4da4802899b1898791ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.293 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.294 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.294 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.294 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>]
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.294 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.319 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/cpu volume: 9830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.332 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.339 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/cpu volume: 200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c57d5fa-200b-4001-9025-ebd5d56a75c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9830000000, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'timestamp': '2026-01-22T22:20:09.294558', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8320f1fe-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.976281474, 'message_signature': 'a6aacea489c2830ceaf787ce4c68f077d906f3901a606f83e0afbedad784edbd'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200000000, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'timestamp': '2026-01-22T22:20:09.294558', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '83240830-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.996248586, 'message_signature': 'cb669a8b56097052d0c60fbdbebc39d5612c5e2b3f15bd9fe6f855f105d5dd8f'}]}, 'timestamp': '2026-01-22 22:20:09.340199', '_unique_id': '23fcd00871a147f48a02f8b1f518e166'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.344 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.345 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.incoming.bytes volume: 706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ea9b331-f793-4358-9c13-3502a31e8ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.344409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '8324d0bc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': '89fa63fdd1840a784a45dc95df1b68fda7341a6aa809d419fa2ec2f778fc4699'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 706, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.344409', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '8324ee30-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '7d91b44c32865557662dc1f62a2c00056d76e006146ff6dde544aa5123bb2673'}]}, 'timestamp': '2026-01-22 22:20:09.345999', '_unique_id': 'fdb1b6e997494cbc987f3c583c7f7e70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.347 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.348 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.348 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.349 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.bytes volume: 28672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.349 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33d9a642-f68c-43b5-8cf8-50acb0e45c82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.348384', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83256324-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '5394e27600f0ca599eeae010efb4e533d872f77e7c434945218faefc4122f0da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 
'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.348384', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83256f2c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '7ec167c45ab4d7152e7761bd9c0016a4abe6da302e84258e957c2386a33f21de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28672, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.348384', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '832579e0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '0214f64b53d7afad9273150902b13d5bceb8037d3432b4f14d4b1fd2e7b22ebe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.348384', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8325864c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '5867134e4384cea56ee5cc30d967c2494d8d0baaf092b07dde6993580f4b018c'}]}, 'timestamp': '2026-01-22 22:20:09.349623', '_unique_id': '9c37607d74474358a8cabe2470967932'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.351 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.351 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e802459-ca03-4322-91a3-d0a248ab6cd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.351319', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '8325d48a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': '8a0c221445b577ea2c144b03955cc4399d7e07a5b995c520fb5d70a0e019ff3f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.351319', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '8325dfd4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '2ad48daf3113034504d34a98cf61814adbfee37eebb79a56113b93744cc5c96b'}]}, 'timestamp': '2026-01-22 22:20:09.351941', '_unique_id': 'bd79e3ad0c1b4136b26b41d7dd21b8c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.352 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.353 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.353 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>]
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.353 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.354 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.354 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/memory.usage volume: 42.125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2624c2b-34d7-462e-bc27-f0a4ace72cd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'timestamp': '2026-01-22T22:20:09.354021', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83263da8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.976281474, 'message_signature': '9bbdc6d057985536fb09e109c55118362be172e3e28b43c922e12a28810760e8'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.125, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'timestamp': '2026-01-22T22:20:09.354021', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '832648a2-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.996248586, 'message_signature': 'cb3ac69b67c0dfb18869b8b06aa848e24ff707559e8b4ec7a1fad3c016a1e83e'}]}, 'timestamp': '2026-01-22 22:20:09.354619', '_unique_id': '90234a242b6a4005979a8f93fd2bb7ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.356 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.356 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.356 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4338f4f4-00b6-4e6f-aacb-02d56a729b9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.356226', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '8326941a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': '8027398c30df64b9ce11146e5aa11ee4d00875f5462bdbf46752940cddfe81bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.356226', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '83269f46-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '80b354d75096948a23520ab364014201a7b47a1d0f595a570c19706662723f87'}]}, 'timestamp': '2026-01-22 22:20:09.357460', '_unique_id': '6b469bd8e17449e6947de3d6d8bd7e8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.359 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.359 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>]
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.359 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.359 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.360 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.360 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '185249be-ef2f-4034-8e04-e39b6a6b8167', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.359509', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8327146c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': '337e467b0363764a7f0ee8dcd9513bcc2ab51c72bea065886537f37f12cb0670'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.359509', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '832721f0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': '65c09f8132868cc65808bd8c964d7b5121d7e2f7630e90a1f3b915b44b4a36d4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.359509', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83272e7a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': 'fd6fdfb16aa3fa66162f224dde0fdb94b16ead40387bd02faced717f9627b6e7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.359509', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83273a6e-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': 'd9581041aea03d2b524f316549cc64317a5de22e83509d49d7ef6d9cffc301da'}]}, 'timestamp': '2026-01-22 22:20:09.360780', '_unique_id': '1828629a1f1346cc9a85d1d1e1f088db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.362 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.362 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '771166a5-dd52-4e0c-a832-e5086909aa52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.362365', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '832785dc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': '0b6aa9711de94fa2c8770723fead7a5a6b0e38a150a701bcf6e50b53070bc027'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.362365', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '83279cd4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '7eb3fc24adeab184f2b0c92088e59a51728ed4d9e28aafc410b6faa8c17660d0'}]}, 'timestamp': '2026-01-22 22:20:09.363318', '_unique_id': '8af58a2326784a96b14d83d29d2a11ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.364 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.365 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1962651622>, <NovaLikeServer: tempest-LiveMigrationTest-server-55126447>]
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.365 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.365 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.366 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43dc783d-4b5e-4817-9707-c9bb50f01677', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.365350', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '832806a6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': 'b408d9c6e496436ab83d0b71dc16b671116bc2e434f73e19ee17680b9de0cd9c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.365350', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '832812b8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': 'eb1c4a1f2e265660145e15488bf50af5791aaf1727d4e021eec42821f5db6a74'}]}, 'timestamp': '2026-01-22 22:20:09.366329', '_unique_id': '9dabb4f442ef45da99e0aac53f86272e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.368 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.368 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfa28a42-f5f9-40f7-98b1-ba1623fed452', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.368171', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '832866a0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': 'a34c6b4d726118695fbff5affcef62ff04fe50ea785816c9c5bea5b68e88ed88'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.368171', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '8328726c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '4ff521f735a823f04a044d5883c7cdba6341be1652034a7a46f51810302e84d9'}]}, 'timestamp': '2026-01-22 22:20:09.368808', '_unique_id': 'fca30c228b124a00804adb43658ad978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.370 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.370 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.latency volume: 154242357 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.370 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.latency volume: 324098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.371 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.371 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6623af5f-bb9f-4ab9-a6aa-360003bf0971', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 154242357, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.370605', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8328c622-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '8f8cce4cea56c85a58baacfe2143504fd09bdc9767a3b8d8510e1a63c2d81d94'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 324098, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 
'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.370605', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8328d25c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': 'f2feba8a9ce37c1d8f7cde31add7378de04133b549a0ef4afc37ca2a3d0555ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.370605', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8328de3c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '991bec2298c7bf2546d8346e22ef8507c3442320e7d74ffafc7f27eb9aa9ce35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.370605', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8328e85a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '6eef5f34494e3c6840c5d34b50ff0b4600bbcba50429ca97ee37c8ed321d8ed1'}]}, 'timestamp': '2026-01-22 22:20:09.371835', '_unique_id': 'd16fc38d265d4f32893441aea89116e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.373 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.373 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.373 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.374 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.374 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ba5772b-fca6-472b-bbf9-71be15c41172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.373454', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83293616-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': '1e00048d43a0e086e0bfa0ddccf939c801bc54e51ed9c9382573222200091461'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 
'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.373454', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83294368-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.862913471, 'message_signature': 'f03c08e45eac3db54c8305d29a05616eb44794fccd64c169426178d7d6f1c3ef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.373454', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83294dcc-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': '3c10a92d3bfbc6a8a75369e8fec46b0476db533b5736297ef5870658a15ad26a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.373454', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '832969c4-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.882607306, 'message_signature': 'fa7ff7e0c9dae52987bc2b1f37a7f93440b00c42360ef09a8689f415da468e2d'}]}, 'timestamp': '2026-01-22 22:20:09.375748', '_unique_id': '425fb52ec0cf4e88999a8ee4c24d1da4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.377 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.378 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3471f28e-3bad-4084-9fb2-66f8b312f0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-0000001e-b484b5f7-0814-4161-b492-633788f2961f-tap7e88b712-be', 'timestamp': '2026-01-22T22:20:09.377644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'tap7e88b712-be', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ad:07:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7e88b712-be'}, 'message_id': '8329da62-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.921583037, 'message_signature': 'ea118f95f0e423a7b9bdaaa56ccaf790c5f29966f4bd7c2e39b09968efa98e1f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'instance-00000016-469eaf2b-7d53-40c9-a233-b27d702a21ed-tap580dc508-63', 'timestamp': '2026-01-22T22:20:09.377644', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'tap580dc508-63', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:a3:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap580dc508-63'}, 'message_id': '8329e78c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.927463115, 'message_signature': '81231d95319acc4ef5bfcf1218b17653753ed30c888ececc47784fa19ff16819'}]}, 'timestamp': '2026-01-22 22:20:09.378332', '_unique_id': '1b031b0a30944ecba25f2960ba88fb64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.380 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.381 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.381 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.381 12 DEBUG ceilometer.compute.pollsters [-] b484b5f7-0814-4161-b492-633788f2961f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.382 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.382 12 DEBUG ceilometer.compute.pollsters [-] 469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfcefb95-332c-4022-ae0a-7f80bef992c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-vda', 'timestamp': '2026-01-22T22:20:09.381617', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '832a73aa-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '3e669fb161d5ac691b147166e4e50496a7666f6e6b16b3f9c982587ee6b4920a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 
'resource_id': 'b484b5f7-0814-4161-b492-633788f2961f-sda', 'timestamp': '2026-01-22T22:20:09.381617', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1962651622', 'name': 'instance-0000001e', 'instance_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '832a7f3a-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.774454324, 'message_signature': '6a6cfceec31c837f1397a514eb44624d8eb306ff0e508133addbe9fce13d20da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-vda', 'timestamp': '2026-01-22T22:20:09.381617', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '832a8c46-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': '164df09ce4b0536bdb11cd82fc0f8b48cf1041f9a0b84b246b218774b9ecddcd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06b4b3807dc64d83b8bfbbf0c4d31d77', 'user_name': None, 'project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'project_name': None, 'resource_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed-sda', 'timestamp': '2026-01-22T22:20:09.381617', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-55126447', 'name': 'instance-00000016', 'instance_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'instance_type': 'm1.nano', 'host': '4f32819645ec419ba6321a0a61dcf28c55c0d7eed5b86153bfa344f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '832aa47e-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4046.808963403, 'message_signature': 'e6d1a974dce6c01688eb9a8525006d15d5b5c6536e4e70416340191768163ae0'}]}, 'timestamp': '2026-01-22 22:20:09.383176', '_unique_id': 'c7dfe3c3698d4e31916680b33585cdfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:20:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:20:09.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.410 182729 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.598 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.599 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5394MB free_disk=73.35121536254883GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.600 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.600 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.650 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating resource usage from migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.680 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 469eaf2b-7d53-40c9-a233-b27d702a21ed actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.681 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.681 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.682 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.752 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.770 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.800 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.800 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.913 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:09 compute-0 nova_compute[182725]: 2026-01-22 22:20:09.914 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.390 182729 DEBUG nova.compute.manager [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.391 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.392 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.392 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.392 182729 DEBUG nova.compute.manager [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.393 182729 WARNING nova.compute.manager [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state active and task_state migrating.
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.393 182729 DEBUG nova.compute.manager [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-changed-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.394 182729 DEBUG nova.compute.manager [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Refreshing instance network info cache due to event network-changed-7e88b712-bef4-4434-b405-04af2a2d3d0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.394 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.395 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.395 182729 DEBUG nova.network.neutron [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Refreshing network info cache for port 7e88b712-bef4-4434-b405-04af2a2d3d0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.417 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.418 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.921 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:10 compute-0 nova_compute[182725]: 2026-01-22 22:20:10.922 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.425 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.426 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.773 182729 DEBUG nova.network.neutron [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updated VIF entry in instance network info cache for port 7e88b712-bef4-4434-b405-04af2a2d3d0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.774 182729 DEBUG nova.network.neutron [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.803 182729 DEBUG oslo_concurrency.lockutils [req-433929f8-1ee3-4d1a-9da8-fa4cfcee3974 req-1920c380-3c2b-4cb0-9890-4072cf70e2da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.932 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:11 compute-0 nova_compute[182725]: 2026-01-22 22:20:11.932 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:12 compute-0 podman[214986]: 2026-01-22 22:20:12.143231626 +0000 UTC m=+0.072559517 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.353 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120412.352593, b484b5f7-0814-4161-b492-633788f2961f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.354 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Paused (Lifecycle Event)
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.379 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.427 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.429 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.430 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.452 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.473 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.492 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.504 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.505 182729 DEBUG nova.virt.libvirt.migration [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 22:20:12 compute-0 kernel: tap7e88b712-be (unregistering): left promiscuous mode
Jan 22 22:20:12 compute-0 NetworkManager[54954]: <info>  [1769120412.6598] device (tap7e88b712-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.672 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00101|binding|INFO|Releasing lport 7e88b712-bef4-4434-b405-04af2a2d3d0f from this chassis (sb_readonly=0)
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00102|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f down in Southbound
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00103|binding|INFO|Releasing lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 from this chassis (sb_readonly=0)
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00104|binding|INFO|Setting lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 down in Southbound
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00105|binding|INFO|Removing iface tap7e88b712-be ovn-installed in OVS
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.677 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00106|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 22:20:12 compute-0 ovn_controller[94850]: 2026-01-22T22:20:12Z|00107|binding|INFO|Releasing lport 65e3ee7d-2176-49c3-aeee-08b035ff0bbf from this chassis (sb_readonly=0)
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.699 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:07:e9 10.100.0.14'], port_security=['fa:16:3e:ad:07:e9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e130c2ec-fef7-4ed2-892d-1e3d7eaab401'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-850980191', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-850980191', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=7e88b712-bef4-4434-b405-04af2a2d3d0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.703 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:a8:b5 19.80.0.59'], port_security=['fa:16:3e:ac:a8:b5 19.80.0.59'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['7e88b712-bef4-4434-b405-04af2a2d3d0f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1993138940', 'neutron:cidrs': '19.80.0.59/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1993138940', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=529a2a70-69a8-4f19-a951-f1c58852ecd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a87b1642-1ac3-4b35-809d-79c74a2f4e13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.706 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 7e88b712-bef4-4434-b405-04af2a2d3d0f in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea unbound from our chassis
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.709 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.723 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.734 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[51845449-2594-4e23-96bc-354066986986]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.745 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 22 22:20:12 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001e.scope: Consumed 12.744s CPU time.
Jan 22 22:20:12 compute-0 systemd-machined[154006]: Machine qemu-13-instance-0000001e terminated.
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.782 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6e9ccf0f-f229-4760-8a28-56c246246ad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.787 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3446cc04-765c-4810-84ac-a18c7706bd7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.801 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.828 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aae55526-d4e1-411c-a60e-581204546bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.855 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[56bacf54-e294-4187-905b-e76f582aa18b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401589, 'reachable_time': 22080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215021, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.884 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9eb051-8d45-4034-b390-3a027f5b531f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap698e77c5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401604, 'tstamp': 401604}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215025, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap698e77c5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401609, 'tstamp': 401609}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215025, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.887 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.891 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.897 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.899 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.900 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.900 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.901 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.903 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a87b1642-1ac3-4b35-809d-79c74a2f4e13 in datapath 75073b6a-f711-4d82-9e11-07cd8a1d16e2 unbound from our chassis
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.905 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75073b6a-f711-4d82-9e11-07cd8a1d16e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.906 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a6d663-0e79-4ffc-b964-5be515791b26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:12.906 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 namespace which is not needed anymore
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.929 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.929 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 22:20:12 compute-0 nova_compute[182725]: 2026-01-22 22:20:12.929 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.008 182729 DEBUG nova.virt.libvirt.guest [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'b484b5f7-0814-4161-b492-633788f2961f' (instance-0000001e) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.009 182729 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migration operation has completed
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.009 182729 INFO nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] _post_live_migration() is started..
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [NOTICE]   (214860) : haproxy version is 2.8.14-c23fe91
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [NOTICE]   (214860) : path to executable is /usr/sbin/haproxy
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [WARNING]  (214860) : Exiting Master process...
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [WARNING]  (214860) : Exiting Master process...
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [ALERT]    (214860) : Current worker (214862) exited with code 143 (Terminated)
Jan 22 22:20:13 compute-0 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[214856]: [WARNING]  (214860) : All workers exited. Exiting... (0)
Jan 22 22:20:13 compute-0 systemd[1]: libpod-90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625.scope: Deactivated successfully.
Jan 22 22:20:13 compute-0 podman[215058]: 2026-01-22 22:20:13.078300628 +0000 UTC m=+0.056309478 container died 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.111 182729 DEBUG nova.compute.manager [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.112 182729 DEBUG oslo_concurrency.lockutils [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.112 182729 DEBUG oslo_concurrency.lockutils [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.113 182729 DEBUG oslo_concurrency.lockutils [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.113 182729 DEBUG nova.compute.manager [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.113 182729 DEBUG nova.compute.manager [req-91cc6873-1966-4f7f-8937-661e1f5cb7a6 req-47376934-92f2-4b02-a89e-9f0d68a64164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:20:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625-userdata-shm.mount: Deactivated successfully.
Jan 22 22:20:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-82f4eaa72c1cdb5c949454a9e3954e231d0403b18da579c66c78a9f7314213da-merged.mount: Deactivated successfully.
Jan 22 22:20:13 compute-0 podman[215058]: 2026-01-22 22:20:13.129760763 +0000 UTC m=+0.107769603 container cleanup 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:20:13 compute-0 systemd[1]: libpod-conmon-90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625.scope: Deactivated successfully.
Jan 22 22:20:13 compute-0 podman[215089]: 2026-01-22 22:20:13.239412272 +0000 UTC m=+0.082155398 container remove 90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.249 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4189324f-edcb-47d0-a634-69835980322f]: (4, ('Thu Jan 22 10:20:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 (90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625)\n90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625\nThu Jan 22 10:20:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 (90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625)\n90ec3ae7214f22e06181bd449a8b9e5f991c217a7347e81e6da631abf0258625\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.252 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4f971d18-2467-48db-b0de-27d2d7d28f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.254 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75073b6a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.256 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:13 compute-0 kernel: tap75073b6a-f0: left promiscuous mode
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.278 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[445535d9-c9d2-4ca8-b741-3ec9eb2c22ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.295 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d348ca51-d3f9-4a1b-947e-e392b690e157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.297 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[be77ea95-50a2-478c-a7a5-7566cdecd52c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.315 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5b898916-cf4c-46ac-bbe4-ea80d831cb94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403615, 'reachable_time': 38040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215114, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.319 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:20:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:13.320 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[340b1f07-e9cd-405b-a260-7242930df8a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d75073b6a\x2df711\x2d4d82\x2d9e11\x2d07cd8a1d16e2.mount: Deactivated successfully.
Jan 22 22:20:13 compute-0 nova_compute[182725]: 2026-01-22 22:20:13.560 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.042 182729 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Activated binding for port 7e88b712-bef4-4434-b405-04af2a2d3d0f and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.042 182729 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.043 182729 DEBUG nova.virt.libvirt.vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:20:01Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.043 182729 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.044 182729 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.044 182729 DEBUG os_vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.046 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.046 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e88b712-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.048 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.050 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.052 182729 INFO os_vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be')
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.053 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.053 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.054 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.054 182729 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.054 182729 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Deleting instance files /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f_del
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.055 182729 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Deletion of /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f_del complete
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.177 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.178 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.178 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.178 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.178 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.178 182729 WARNING nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state active and task_state migrating.
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 WARNING nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state active and task_state migrating.
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.179 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.180 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.180 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.180 182729 DEBUG oslo_concurrency.lockutils [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.180 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:15 compute-0 nova_compute[182725]: 2026-01-22 22:20:15.180 182729 DEBUG nova.compute.manager [req-21a82b79-f58c-4075-9a48-247ee068a3a1 req-c69e61dd-e8f9-4143-ba4d-2508ef74b34e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:20:16 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:20:16 compute-0 systemd[214890]: Activating special unit Exit the Session...
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped target Main User Target.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped target Basic System.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped target Paths.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped target Sockets.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped target Timers.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:20:16 compute-0 systemd[214890]: Closed D-Bus User Message Bus Socket.
Jan 22 22:20:16 compute-0 systemd[214890]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:20:16 compute-0 systemd[214890]: Removed slice User Application Slice.
Jan 22 22:20:16 compute-0 systemd[214890]: Reached target Shutdown.
Jan 22 22:20:16 compute-0 systemd[214890]: Finished Exit the Session.
Jan 22 22:20:16 compute-0 systemd[214890]: Reached target Exit the Session.
Jan 22 22:20:16 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:20:16 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:20:16 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:20:16 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:20:16 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:20:16 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:20:16 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.340 182729 DEBUG nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.340 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.341 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.341 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.342 182729 DEBUG nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.342 182729 WARNING nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state active and task_state migrating.
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.342 182729 DEBUG nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.343 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.344 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.344 182729 DEBUG oslo_concurrency.lockutils [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.345 182729 DEBUG nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:17 compute-0 nova_compute[182725]: 2026-01-22 22:20:17.345 182729 WARNING nova.compute.manager [req-5b633003-6215-4f93-8c09-7291c651d29a req-d06a2fd4-4641-452c-a0f7-c7ebf560d9e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state active and task_state migrating.
Jan 22 22:20:18 compute-0 nova_compute[182725]: 2026-01-22 22:20:18.563 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:20 compute-0 nova_compute[182725]: 2026-01-22 22:20:20.050 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.033 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.034 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.034 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.056 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.057 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.057 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.057 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.126 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.191 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.193 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.252 182729 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.407 182729 WARNING nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.408 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5515MB free_disk=73.34810638427734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.409 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.409 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.460 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration for instance b484b5f7-0814-4161-b492-633788f2961f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.479 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.510 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance 469eaf2b-7d53-40c9-a233-b27d702a21ed actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.511 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.511 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.511 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.570 182729 DEBUG nova.compute.provider_tree [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.592 182729 DEBUG nova.scheduler.client.report [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.616 182729 DEBUG nova.compute.resource_tracker [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.616 182729 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.631 182729 INFO nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.724 182729 INFO nova.scheduler.client.report [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Deleted allocation for migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587
Jan 22 22:20:21 compute-0 nova_compute[182725]: 2026-01-22 22:20:21.725 182729 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 22 22:20:23 compute-0 nova_compute[182725]: 2026-01-22 22:20:23.566 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:25 compute-0 nova_compute[182725]: 2026-01-22 22:20:25.054 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:25 compute-0 podman[215123]: 2026-01-22 22:20:25.200908459 +0000 UTC m=+0.117912538 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:20:27 compute-0 nova_compute[182725]: 2026-01-22 22:20:27.925 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120412.925135, b484b5f7-0814-4161-b492-633788f2961f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:27 compute-0 nova_compute[182725]: 2026-01-22 22:20:27.926 182729 INFO nova.compute.manager [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Stopped (Lifecycle Event)
Jan 22 22:20:27 compute-0 nova_compute[182725]: 2026-01-22 22:20:27.967 182729 DEBUG nova.compute.manager [None req-7f6a9718-0c5f-490f-9770-592ce400602f - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:28 compute-0 podman[215145]: 2026-01-22 22:20:28.195373667 +0000 UTC m=+0.116382800 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 22 22:20:28 compute-0 podman[215144]: 2026-01-22 22:20:28.237345003 +0000 UTC m=+0.156998802 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 22:20:28 compute-0 nova_compute[182725]: 2026-01-22 22:20:28.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.753 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.754 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.754 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.755 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.755 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.771 182729 INFO nova.compute.manager [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Terminating instance
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.788 182729 DEBUG nova.compute.manager [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:20:29 compute-0 kernel: tap580dc508-63 (unregistering): left promiscuous mode
Jan 22 22:20:29 compute-0 NetworkManager[54954]: <info>  [1769120429.8116] device (tap580dc508-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:20:29 compute-0 ovn_controller[94850]: 2026-01-22T22:20:29Z|00108|binding|INFO|Releasing lport 580dc508-636a-420e-aed2-8efd9dccace5 from this chassis (sb_readonly=0)
Jan 22 22:20:29 compute-0 ovn_controller[94850]: 2026-01-22T22:20:29Z|00109|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 down in Southbound
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.823 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:29 compute-0 ovn_controller[94850]: 2026-01-22T22:20:29Z|00110|binding|INFO|Removing iface tap580dc508-63 ovn-installed in OVS
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.827 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:29.832 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:20:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:29.834 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea unbound from our chassis
Jan 22 22:20:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:29.836 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:20:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:29.838 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ed3eb3-1c57-466f-94fc-066a86e005dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:29.839 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace which is not needed anymore
Jan 22 22:20:29 compute-0 nova_compute[182725]: 2026-01-22 22:20:29.855 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:29 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 22 22:20:29 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000016.scope: Consumed 3.882s CPU time.
Jan 22 22:20:29 compute-0 systemd-machined[154006]: Machine qemu-12-instance-00000016 terminated.
Jan 22 22:20:30 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [NOTICE]   (214535) : haproxy version is 2.8.14-c23fe91
Jan 22 22:20:30 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [NOTICE]   (214535) : path to executable is /usr/sbin/haproxy
Jan 22 22:20:30 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [WARNING]  (214535) : Exiting Master process...
Jan 22 22:20:30 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [ALERT]    (214535) : Current worker (214537) exited with code 143 (Terminated)
Jan 22 22:20:30 compute-0 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214531]: [WARNING]  (214535) : All workers exited. Exiting... (0)
Jan 22 22:20:30 compute-0 systemd[1]: libpod-5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8.scope: Deactivated successfully.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.057 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 podman[215215]: 2026-01-22 22:20:30.060219696 +0000 UTC m=+0.070709291 container died 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.080 182729 INFO nova.virt.libvirt.driver [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance destroyed successfully.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.082 182729 DEBUG nova.objects.instance [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'resources' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.100 182729 DEBUG nova.virt.libvirt.vif [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:42Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.101 182729 DEBUG nova.network.os_vif_util [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.102 182729 DEBUG nova.network.os_vif_util [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.103 182729 DEBUG os_vif [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.106 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap580dc508-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8-userdata-shm.mount: Deactivated successfully.
Jan 22 22:20:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-859ec6cf10c9e8a8d79cb1286fe9a9ec3730c8f7d29da7c1e4ddcbdfb4090dd2-merged.mount: Deactivated successfully.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.110 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.112 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.115 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 podman[215215]: 2026-01-22 22:20:30.12000706 +0000 UTC m=+0.130496635 container cleanup 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.119 182729 INFO os_vif [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.120 182729 INFO nova.virt.libvirt.driver [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deleting instance files /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.121 182729 INFO nova.virt.libvirt.driver [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deletion of /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del complete
Jan 22 22:20:30 compute-0 systemd[1]: libpod-conmon-5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8.scope: Deactivated successfully.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.191 182729 INFO nova.compute.manager [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.192 182729 DEBUG oslo.service.loopingcall [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.193 182729 DEBUG nova.compute.manager [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.194 182729 DEBUG nova.network.neutron [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:20:30 compute-0 podman[215260]: 2026-01-22 22:20:30.216235562 +0000 UTC m=+0.064152875 container remove 5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.223 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[28d23b9f-9b15-42a3-b2be-bd46394a9f8f]: (4, ('Thu Jan 22 10:20:29 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8)\n5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8\nThu Jan 22 10:20:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8)\n5e846f3a529e57bfccea5c8c538a05bd104768addf8ee7a989b9edb759022da8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.226 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[03070f7f-45d0-414e-b762-0868f2d63231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.227 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:30 compute-0 kernel: tap698e77c5-f0: left promiscuous mode
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.229 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.232 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.235 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbbaa6a-263f-4005-b6f2-2128a2924f99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.256 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.262 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdb2634-0b4b-4975-86e8-f41bd269b9e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.264 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0d88fd2e-44e2-4600-ade6-ed05e65355e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.290 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0e37075a-af21-402a-9ff7-9992f7ff830d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401578, 'reachable_time': 21678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215275, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d698e77c5\x2dfce6\x2d47a5\x2db6e3\x2df4c56da226ea.mount: Deactivated successfully.
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.297 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:20:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:30.297 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f201ff-b11e-4d64-b070-556547ae81ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.792 182729 DEBUG nova.network.neutron [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.826 182729 INFO nova.compute.manager [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 0.63 seconds to deallocate network for instance.
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.898 182729 DEBUG nova.compute.manager [req-8527777f-a063-4b92-9861-83872f9f103d req-2ff3fc92-6001-4b3a-91cd-0b41c04ba035 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-deleted-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.912 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.912 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:30 compute-0 nova_compute[182725]: 2026-01-22 22:20:30.986 182729 DEBUG nova.compute.provider_tree [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:20:31 compute-0 nova_compute[182725]: 2026-01-22 22:20:31.001 182729 DEBUG nova.scheduler.client.report [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:20:31 compute-0 nova_compute[182725]: 2026-01-22 22:20:31.023 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:31 compute-0 nova_compute[182725]: 2026-01-22 22:20:31.057 182729 INFO nova.scheduler.client.report [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Deleted allocations for instance 469eaf2b-7d53-40c9-a233-b27d702a21ed
Jan 22 22:20:31 compute-0 nova_compute[182725]: 2026-01-22 22:20:31.122 182729 DEBUG oslo_concurrency.lockutils [None req-70fbd3ee-ed49-4771-89f5-62b97f612492 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:32.127 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:20:32 compute-0 nova_compute[182725]: 2026-01-22 22:20:32.128 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:32.129 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:20:33 compute-0 nova_compute[182725]: 2026-01-22 22:20:33.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.109 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.161 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.825 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.826 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.851 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.967 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.969 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.979 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:20:35 compute-0 nova_compute[182725]: 2026-01-22 22:20:35.979 182729 INFO nova.compute.claims [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.091 182729 DEBUG nova.scheduler.client.report [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.119 182729 DEBUG nova.scheduler.client.report [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.120 182729 DEBUG nova.compute.provider_tree [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.134 182729 DEBUG nova.scheduler.client.report [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.159 182729 DEBUG nova.scheduler.client.report [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.216 182729 DEBUG nova.compute.provider_tree [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.230 182729 DEBUG nova.scheduler.client.report [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.259 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.261 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.329 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.329 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.348 182729 INFO nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.370 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.476 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.478 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.478 182729 INFO nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Creating image(s)
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.480 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.480 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.481 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.498 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.560 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.562 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.563 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.579 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.648 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.649 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.703 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.704 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.704 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.779 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.780 182729 DEBUG nova.virt.disk.api [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Checking if we can resize image /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.781 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.805 182729 DEBUG nova.policy [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7de9ef653dde4c0e8525c15bc52dc809', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '25f7fcbe33ce4b5fb686827d79a71058', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.850 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.852 182729 DEBUG nova.virt.disk.api [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Cannot resize image /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.853 182729 DEBUG nova.objects.instance [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lazy-loading 'migration_context' on Instance uuid 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.878 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.879 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Ensure instance console log exists: /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.880 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.880 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:36 compute-0 nova_compute[182725]: 2026-01-22 22:20:36.881 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:37 compute-0 nova_compute[182725]: 2026-01-22 22:20:37.828 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Successfully created port: 24c4cb21-9bec-4310-af64-2396a78ebb7f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:20:38 compute-0 nova_compute[182725]: 2026-01-22 22:20:38.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.056 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Successfully updated port: 24c4cb21-9bec-4310-af64-2396a78ebb7f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.085 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.086 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquired lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.086 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:20:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:39.131 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:39 compute-0 podman[215291]: 2026-01-22 22:20:39.152221871 +0000 UTC m=+0.072188737 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 22:20:39 compute-0 podman[215292]: 2026-01-22 22:20:39.177157958 +0000 UTC m=+0.088691363 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.210 182729 DEBUG nova.compute.manager [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.210 182729 DEBUG nova.compute.manager [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing instance network info cache due to event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.211 182729 DEBUG oslo_concurrency.lockutils [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:39 compute-0 nova_compute[182725]: 2026-01-22 22:20:39.282 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.111 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.453 182729 DEBUG nova.network.neutron [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updating instance_info_cache with network_info: [{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.471 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Releasing lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.472 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Instance network_info: |[{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.472 182729 DEBUG oslo_concurrency.lockutils [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.474 182729 DEBUG nova.network.neutron [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.480 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Start _get_guest_xml network_info=[{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.487 182729 WARNING nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.492 182729 DEBUG nova.virt.libvirt.host [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.493 182729 DEBUG nova.virt.libvirt.host [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.496 182729 DEBUG nova.virt.libvirt.host [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.497 182729 DEBUG nova.virt.libvirt.host [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.498 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.499 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.499 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.499 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.500 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.500 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.500 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.500 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.501 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.501 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.501 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.502 182729 DEBUG nova.virt.hardware [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.506 182729 DEBUG nova.virt.libvirt.vif [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:20:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-278163831',display_name='tempest-FloatingIPsAssociationTestJSON-server-278163831',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-278163831',id=32,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25f7fcbe33ce4b5fb686827d79a71058',ramdisk_id='',reservation_id='r-dhf2apm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1815924143',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1815924143-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:20:36Z,user_data=None,user_id='7de9ef653dde4c0e8525c15bc52dc809',uuid=4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.507 182729 DEBUG nova.network.os_vif_util [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converting VIF {"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.507 182729 DEBUG nova.network.os_vif_util [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.508 182729 DEBUG nova.objects.instance [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.523 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <uuid>4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9</uuid>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <name>instance-00000020</name>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-278163831</nova:name>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:20:40</nova:creationTime>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:user uuid="7de9ef653dde4c0e8525c15bc52dc809">tempest-FloatingIPsAssociationTestJSON-1815924143-project-member</nova:user>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:project uuid="25f7fcbe33ce4b5fb686827d79a71058">tempest-FloatingIPsAssociationTestJSON-1815924143</nova:project>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         <nova:port uuid="24c4cb21-9bec-4310-af64-2396a78ebb7f">
Jan 22 22:20:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <system>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="serial">4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="uuid">4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </system>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <os>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </os>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <features>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </features>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.config"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:95:28:0e"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <target dev="tap24c4cb21-9b"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/console.log" append="off"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <video>
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </video>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:20:40 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:20:40 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:20:40 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:20:40 compute-0 nova_compute[182725]: </domain>
Jan 22 22:20:40 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.524 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Preparing to wait for external event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.525 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.525 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.525 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.526 182729 DEBUG nova.virt.libvirt.vif [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:20:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-278163831',display_name='tempest-FloatingIPsAssociationTestJSON-server-278163831',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-278163831',id=32,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25f7fcbe33ce4b5fb686827d79a71058',ramdisk_id='',reservation_id='r-dhf2apm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1815924143',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1815924143-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:20:36Z,user_data=None,user_id='7de9ef653dde4c0e8525c15bc52dc809',uuid=4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.526 182729 DEBUG nova.network.os_vif_util [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converting VIF {"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.527 182729 DEBUG nova.network.os_vif_util [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.528 182729 DEBUG os_vif [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.528 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.529 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.529 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.533 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.533 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c4cb21-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.534 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap24c4cb21-9b, col_values=(('external_ids', {'iface-id': '24c4cb21-9bec-4310-af64-2396a78ebb7f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:28:0e', 'vm-uuid': '4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:40 compute-0 NetworkManager[54954]: <info>  [1769120440.5370] manager: (tap24c4cb21-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.537 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.543 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.544 182729 INFO os_vif [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b')
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.598 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.598 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.599 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] No VIF found with MAC fa:16:3e:95:28:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:20:40 compute-0 nova_compute[182725]: 2026-01-22 22:20:40.599 182729 INFO nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Using config drive
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.093 182729 INFO nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Creating config drive at /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.config
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.098 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wk3r9tg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.235 182729 DEBUG oslo_concurrency.processutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wk3r9tg" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:41 compute-0 kernel: tap24c4cb21-9b: entered promiscuous mode
Jan 22 22:20:41 compute-0 ovn_controller[94850]: 2026-01-22T22:20:41Z|00111|binding|INFO|Claiming lport 24c4cb21-9bec-4310-af64-2396a78ebb7f for this chassis.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 ovn_controller[94850]: 2026-01-22T22:20:41Z|00112|binding|INFO|24c4cb21-9bec-4310-af64-2396a78ebb7f: Claiming fa:16:3e:95:28:0e 10.100.0.10
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.3321] manager: (tap24c4cb21-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.334 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.357 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:28:0e 10.100.0.10'], port_security=['fa:16:3e:95:28:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-558b365e-7816-4d24-b8e7-6a29cc71889d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25f7fcbe33ce4b5fb686827d79a71058', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fcdc89f3-4a1f-4042-81e3-e273744215dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbcc9ff3-1f39-47e2-9239-70f6e6478a90, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=24c4cb21-9bec-4310-af64-2396a78ebb7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.358 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 24c4cb21-9bec-4310-af64-2396a78ebb7f in datapath 558b365e-7816-4d24-b8e7-6a29cc71889d bound to our chassis
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.359 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 558b365e-7816-4d24-b8e7-6a29cc71889d
Jan 22 22:20:41 compute-0 systemd-machined[154006]: New machine qemu-14-instance-00000020.
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.372 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[45ade8f2-262b-4ffa-99fe-3bc2f0d73423]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.374 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap558b365e-71 in ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.376 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap558b365e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.377 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[faeb0fa4-b501-4722-a5bf-4fb22674b6d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.378 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac17e13a-b883-4fd5-a361-6806d3aa5b8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.397 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[37eb12ef-1898-44db-b76b-75079a0fb5cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000020.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.411 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 ovn_controller[94850]: 2026-01-22T22:20:41Z|00113|binding|INFO|Setting lport 24c4cb21-9bec-4310-af64-2396a78ebb7f ovn-installed in OVS
Jan 22 22:20:41 compute-0 ovn_controller[94850]: 2026-01-22T22:20:41Z|00114|binding|INFO|Setting lport 24c4cb21-9bec-4310-af64-2396a78ebb7f up in Southbound
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.415 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[76b30695-d150-4234-b24e-85ae243bd548]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.416 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 systemd-udevd[215356]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.446 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aac3ec21-9b59-4588-b3b8-c7c747afce19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.4533] manager: (tap558b365e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.452 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b5162516-e7ed-4e04-8755-f5b6ae0a903e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 systemd-udevd[215361]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.4680] device (tap24c4cb21-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.4686] device (tap24c4cb21-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.487 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b0236022-7856-4a11-8488-0875cec9f76d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.491 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[88a1b0f0-c197-40b6-9f79-ff9110aeca44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.5173] device (tap558b365e-70): carrier: link connected
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.523 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4b89f79f-fc29-4b7f-b4cf-f2c0565936a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.539 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[acae04ff-d6ee-4b92-a57c-433255218b0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap558b365e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:67:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407912, 'reachable_time': 21428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215384, 'error': None, 'target': 'ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.554 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bedc7f3a-bdec-4b98-a489-0538c20d01a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:679d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 407912, 'tstamp': 407912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215385, 'error': None, 'target': 'ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.570 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[de50e7c9-dc4a-4feb-baa5-e3897a8dec41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap558b365e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:67:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407912, 'reachable_time': 21428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215386, 'error': None, 'target': 'ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.604 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60c4609a-da04-47a5-b451-85bdc6fcd08a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.663 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1a0b22-45fa-4e0e-86b2-08cac8dc0e1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.665 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap558b365e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.665 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.666 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap558b365e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:41 compute-0 kernel: tap558b365e-70: entered promiscuous mode
Jan 22 22:20:41 compute-0 NetworkManager[54954]: <info>  [1769120441.6687] manager: (tap558b365e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.668 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.670 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.672 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap558b365e-70, col_values=(('external_ids', {'iface-id': 'aa76405e-fde4-4806-a8b5-70eb3b070f83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.674 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 ovn_controller[94850]: 2026-01-22T22:20:41Z|00115|binding|INFO|Releasing lport aa76405e-fde4-4806-a8b5-70eb3b070f83 from this chassis (sb_readonly=0)
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.675 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.676 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120441.6756845, 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.677 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] VM Started (Lifecycle Event)
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.678 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/558b365e-7816-4d24-b8e7-6a29cc71889d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/558b365e-7816-4d24-b8e7-6a29cc71889d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.679 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[50c17a51-db0e-45b2-9bde-7e1d6a14ae94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.680 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-558b365e-7816-4d24-b8e7-6a29cc71889d
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/558b365e-7816-4d24-b8e7-6a29cc71889d.pid.haproxy
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 558b365e-7816-4d24-b8e7-6a29cc71889d
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:20:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:20:41.680 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d', 'env', 'PROCESS_TAG=haproxy-558b365e-7816-4d24-b8e7-6a29cc71889d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/558b365e-7816-4d24-b8e7-6a29cc71889d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.686 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.700 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.703 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120441.6757598, 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.704 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] VM Paused (Lifecycle Event)
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.719 182729 DEBUG nova.compute.manager [req-846d6c13-df40-414a-b38e-8c541a7ae8c3 req-029cd373-abbb-4d91-bc13-dc8f92583d0f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.720 182729 DEBUG oslo_concurrency.lockutils [req-846d6c13-df40-414a-b38e-8c541a7ae8c3 req-029cd373-abbb-4d91-bc13-dc8f92583d0f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.720 182729 DEBUG oslo_concurrency.lockutils [req-846d6c13-df40-414a-b38e-8c541a7ae8c3 req-029cd373-abbb-4d91-bc13-dc8f92583d0f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.721 182729 DEBUG oslo_concurrency.lockutils [req-846d6c13-df40-414a-b38e-8c541a7ae8c3 req-029cd373-abbb-4d91-bc13-dc8f92583d0f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.721 182729 DEBUG nova.compute.manager [req-846d6c13-df40-414a-b38e-8c541a7ae8c3 req-029cd373-abbb-4d91-bc13-dc8f92583d0f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Processing event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.722 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.726 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.727 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.731 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120441.725768, 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.732 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] VM Resumed (Lifecycle Event)
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.734 182729 INFO nova.virt.libvirt.driver [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Instance spawned successfully.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.735 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.757 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.757 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.758 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.758 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.759 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.760 182729 DEBUG nova.virt.libvirt.driver [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.767 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.770 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.793 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.833 182729 INFO nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Took 5.36 seconds to spawn the instance on the hypervisor.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.834 182729 DEBUG nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.923 182729 INFO nova.compute.manager [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Took 5.99 seconds to build instance.
Jan 22 22:20:41 compute-0 nova_compute[182725]: 2026-01-22 22:20:41.962 182729 DEBUG oslo_concurrency.lockutils [None req-166a0544-1b92-4cdc-9a6a-10d9ef053886 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:42 compute-0 nova_compute[182725]: 2026-01-22 22:20:42.098 182729 DEBUG nova.network.neutron [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updated VIF entry in instance network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:20:42 compute-0 nova_compute[182725]: 2026-01-22 22:20:42.099 182729 DEBUG nova.network.neutron [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updating instance_info_cache with network_info: [{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:42 compute-0 nova_compute[182725]: 2026-01-22 22:20:42.114 182729 DEBUG oslo_concurrency.lockutils [req-d5214522-a02e-46b1-8216-2e40fb97d642 req-9dd11dad-5b9f-4cc1-bda2-f3310ef71542 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:20:42 compute-0 podman[215422]: 2026-01-22 22:20:42.136178355 +0000 UTC m=+0.077243155 container create fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:20:42 compute-0 systemd[1]: Started libpod-conmon-fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c.scope.
Jan 22 22:20:42 compute-0 podman[215422]: 2026-01-22 22:20:42.097763368 +0000 UTC m=+0.038828208 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:20:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:20:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/408cb713207cb90a4ad88fd48965a220ec90207688e50b730b71fbca0bfb1e9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:20:42 compute-0 podman[215422]: 2026-01-22 22:20:42.228938469 +0000 UTC m=+0.170003269 container init fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:20:42 compute-0 podman[215422]: 2026-01-22 22:20:42.241187277 +0000 UTC m=+0.182252057 container start fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:20:42 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [NOTICE]   (215452) : New worker (215461) forked
Jan 22 22:20:42 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [NOTICE]   (215452) : Loading success.
Jan 22 22:20:42 compute-0 podman[215437]: 2026-01-22 22:20:42.277631965 +0000 UTC m=+0.072986748 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.576 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.928 182729 DEBUG nova.compute.manager [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.929 182729 DEBUG oslo_concurrency.lockutils [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.929 182729 DEBUG oslo_concurrency.lockutils [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.930 182729 DEBUG oslo_concurrency.lockutils [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.930 182729 DEBUG nova.compute.manager [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] No waiting events found dispatching network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:20:43 compute-0 nova_compute[182725]: 2026-01-22 22:20:43.930 182729 WARNING nova.compute.manager [req-b0344a39-3a6b-4466-84df-34cbbf47e015 req-3515ba89-b9d5-4236-aae3-4a0a65d17f22 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received unexpected event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f for instance with vm_state active and task_state None.
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.077 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120430.0764453, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.078 182729 INFO nova.compute.manager [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Stopped (Lifecycle Event)
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.100 182729 DEBUG nova.compute.manager [None req-78778e93-4084-47e2-b5c1-7b4341a8a40f - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.685 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6911] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/57)
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6922] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <warn>  [1769120445.6923] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6933] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/58)
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6938] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <warn>  [1769120445.6939] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6948] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6955] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6961] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 22:20:45 compute-0 NetworkManager[54954]: <info>  [1769120445.6965] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 22:20:45 compute-0 ovn_controller[94850]: 2026-01-22T22:20:45Z|00116|binding|INFO|Releasing lport aa76405e-fde4-4806-a8b5-70eb3b070f83 from this chassis (sb_readonly=0)
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.774 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:45 compute-0 nova_compute[182725]: 2026-01-22 22:20:45.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:48 compute-0 nova_compute[182725]: 2026-01-22 22:20:48.579 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:50 compute-0 nova_compute[182725]: 2026-01-22 22:20:50.540 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.825 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.828 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.861 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.974 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.975 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.988 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:20:51 compute-0 nova_compute[182725]: 2026-01-22 22:20:51.988 182729 INFO nova.compute.claims [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.124 182729 DEBUG nova.compute.manager [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.125 182729 DEBUG nova.compute.manager [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing instance network info cache due to event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.125 182729 DEBUG oslo_concurrency.lockutils [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.127 182729 DEBUG oslo_concurrency.lockutils [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.127 182729 DEBUG nova.network.neutron [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.210 182729 DEBUG nova.compute.provider_tree [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.225 182729 DEBUG nova.scheduler.client.report [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.246 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.247 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.327 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.328 182729 DEBUG nova.network.neutron [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.350 182729 INFO nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.377 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.496 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.498 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.499 182729 INFO nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Creating image(s)
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.500 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.501 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.502 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.527 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.615 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.618 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.620 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.649 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.712 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.714 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.937 182729 DEBUG nova.network.neutron [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:20:52 compute-0 nova_compute[182725]: 2026-01-22 22:20:52.937 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.004 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk 1073741824" returned: 0 in 0.290s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.005 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.006 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.083 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.085 182729 DEBUG nova.virt.disk.api [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Checking if we can resize image /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.086 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.149 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.151 182729 DEBUG nova.virt.disk.api [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Cannot resize image /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.152 182729 DEBUG nova.objects.instance [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lazy-loading 'migration_context' on Instance uuid 2356e839-a7e6-49f6-b657-19f93f2ad1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.168 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.169 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Ensure instance console log exists: /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.169 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.170 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.170 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.173 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.187 182729 WARNING nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.195 182729 DEBUG nova.virt.libvirt.host [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.195 182729 DEBUG nova.virt.libvirt.host [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.199 182729 DEBUG nova.virt.libvirt.host [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.199 182729 DEBUG nova.virt.libvirt.host [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.200 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.201 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.201 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.201 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.201 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.201 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.202 182729 DEBUG nova.virt.hardware [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.206 182729 DEBUG nova.objects.instance [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2356e839-a7e6-49f6-b657-19f93f2ad1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.219 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <uuid>2356e839-a7e6-49f6-b657-19f93f2ad1f0</uuid>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <name>instance-00000023</name>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:name>tempest-ListImageFiltersTestJSON-server-658019017</nova:name>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:20:53</nova:creationTime>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:user uuid="cffbe73f5e07412b97260458ca70b2b4">tempest-ListImageFiltersTestJSON-692630058-project-member</nova:user>
Jan 22 22:20:53 compute-0 nova_compute[182725]:         <nova:project uuid="9b876429c3fd4b08951c6822abfb5eb1">tempest-ListImageFiltersTestJSON-692630058</nova:project>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <system>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="serial">2356e839-a7e6-49f6-b657-19f93f2ad1f0</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="uuid">2356e839-a7e6-49f6-b657-19f93f2ad1f0</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </system>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <os>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </os>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <features>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </features>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.config"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/console.log" append="off"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <video>
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </video>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:20:53 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:20:53 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:20:53 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:20:53 compute-0 nova_compute[182725]: </domain>
Jan 22 22:20:53 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.282 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.283 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.283 182729 INFO nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Using config drive
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.529 182729 INFO nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Creating config drive at /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.config
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.540 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_vziba7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.581 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.678 182729 DEBUG oslo_concurrency.processutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_vziba7" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:20:53 compute-0 systemd-machined[154006]: New machine qemu-15-instance-00000023.
Jan 22 22:20:53 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000023.
Jan 22 22:20:53 compute-0 ovn_controller[94850]: 2026-01-22T22:20:53Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:28:0e 10.100.0.10
Jan 22 22:20:53 compute-0 ovn_controller[94850]: 2026-01-22T22:20:53Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:28:0e 10.100.0.10
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.937 182729 DEBUG nova.network.neutron [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updated VIF entry in instance network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.937 182729 DEBUG nova.network.neutron [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updating instance_info_cache with network_info: [{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:20:53 compute-0 nova_compute[182725]: 2026-01-22 22:20:53.964 182729 DEBUG oslo_concurrency.lockutils [req-fc1ff957-d1ee-4b2f-b541-4c3f8a30a32d req-2d043c90-cefb-4c64-8774-41b92e9864a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:20:54 compute-0 ovn_controller[94850]: 2026-01-22T22:20:54Z|00117|binding|INFO|Releasing lport aa76405e-fde4-4806-a8b5-70eb3b070f83 from this chassis (sb_readonly=0)
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.081 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.104 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120454.1040776, 2356e839-a7e6-49f6-b657-19f93f2ad1f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.105 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] VM Resumed (Lifecycle Event)
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.109 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.110 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.116 182729 INFO nova.virt.libvirt.driver [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance spawned successfully.
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.117 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.143 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.155 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.162 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.163 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.164 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.165 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.166 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.167 182729 DEBUG nova.virt.libvirt.driver [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.206 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.206 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120454.10806, 2356e839-a7e6-49f6-b657-19f93f2ad1f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.207 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] VM Started (Lifecycle Event)
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.234 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.238 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.276 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.277 182729 INFO nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Took 1.78 seconds to spawn the instance on the hypervisor.
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.277 182729 DEBUG nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.370 182729 INFO nova.compute.manager [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Took 2.45 seconds to build instance.
Jan 22 22:20:54 compute-0 nova_compute[182725]: 2026-01-22 22:20:54.398 182729 DEBUG oslo_concurrency.lockutils [None req-7831080c-2b3d-41c8-8998-91d69ffb12b8 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:20:55 compute-0 nova_compute[182725]: 2026-01-22 22:20:55.545 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:56 compute-0 podman[215529]: 2026-01-22 22:20:56.169710026 +0000 UTC m=+0.088727004 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 22 22:20:58 compute-0 nova_compute[182725]: 2026-01-22 22:20:58.592 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:20:59 compute-0 podman[215549]: 2026-01-22 22:20:59.133848113 +0000 UTC m=+0.063811229 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 22:20:59 compute-0 podman[215548]: 2026-01-22 22:20:59.185473489 +0000 UTC m=+0.119728051 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:20:59 compute-0 nova_compute[182725]: 2026-01-22 22:20:59.417 182729 DEBUG nova.compute.manager [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:20:59 compute-0 nova_compute[182725]: 2026-01-22 22:20:59.417 182729 DEBUG nova.compute.manager [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing instance network info cache due to event network-changed-24c4cb21-9bec-4310-af64-2396a78ebb7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:20:59 compute-0 nova_compute[182725]: 2026-01-22 22:20:59.418 182729 DEBUG oslo_concurrency.lockutils [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:20:59 compute-0 nova_compute[182725]: 2026-01-22 22:20:59.419 182729 DEBUG oslo_concurrency.lockutils [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:20:59 compute-0 nova_compute[182725]: 2026-01-22 22:20:59.419 182729 DEBUG nova.network.neutron [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Refreshing network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:21:00 compute-0 nova_compute[182725]: 2026-01-22 22:21:00.553 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.808 182729 DEBUG nova.network.neutron [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updated VIF entry in instance network info cache for port 24c4cb21-9bec-4310-af64-2396a78ebb7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.809 182729 DEBUG nova.network.neutron [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updating instance_info_cache with network_info: [{"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.856 182729 DEBUG oslo_concurrency.lockutils [req-737854da-5878-463d-a84f-158f258f8c88 req-46221d01-cd57-425d-bcda-5792615db502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.968 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.969 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.970 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.970 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.971 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:02 compute-0 nova_compute[182725]: 2026-01-22 22:21:02.985 182729 INFO nova.compute.manager [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Terminating instance
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.002 182729 DEBUG nova.compute.manager [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:21:03 compute-0 kernel: tap24c4cb21-9b (unregistering): left promiscuous mode
Jan 22 22:21:03 compute-0 NetworkManager[54954]: <info>  [1769120463.0342] device (tap24c4cb21-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:21:03 compute-0 ovn_controller[94850]: 2026-01-22T22:21:03Z|00118|binding|INFO|Releasing lport 24c4cb21-9bec-4310-af64-2396a78ebb7f from this chassis (sb_readonly=0)
Jan 22 22:21:03 compute-0 ovn_controller[94850]: 2026-01-22T22:21:03Z|00119|binding|INFO|Setting lport 24c4cb21-9bec-4310-af64-2396a78ebb7f down in Southbound
Jan 22 22:21:03 compute-0 ovn_controller[94850]: 2026-01-22T22:21:03Z|00120|binding|INFO|Removing iface tap24c4cb21-9b ovn-installed in OVS
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.048 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.058 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:28:0e 10.100.0.10'], port_security=['fa:16:3e:95:28:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-558b365e-7816-4d24-b8e7-6a29cc71889d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25f7fcbe33ce4b5fb686827d79a71058', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fcdc89f3-4a1f-4042-81e3-e273744215dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbcc9ff3-1f39-47e2-9239-70f6e6478a90, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=24c4cb21-9bec-4310-af64-2396a78ebb7f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.088 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.089 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 24c4cb21-9bec-4310-af64-2396a78ebb7f in datapath 558b365e-7816-4d24-b8e7-6a29cc71889d unbound from our chassis
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.090 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 558b365e-7816-4d24-b8e7-6a29cc71889d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.092 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[91a8c321-7d09-4299-afad-3aafe3b96fbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.093 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d namespace which is not needed anymore
Jan 22 22:21:03 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 22 22:21:03 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Consumed 12.647s CPU time.
Jan 22 22:21:03 compute-0 systemd-machined[154006]: Machine qemu-14-instance-00000020 terminated.
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.256 182729 DEBUG nova.compute.manager [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-unplugged-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.257 182729 DEBUG oslo_concurrency.lockutils [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:03 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [NOTICE]   (215452) : haproxy version is 2.8.14-c23fe91
Jan 22 22:21:03 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [NOTICE]   (215452) : path to executable is /usr/sbin/haproxy
Jan 22 22:21:03 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [WARNING]  (215452) : Exiting Master process...
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.258 182729 DEBUG oslo_concurrency.lockutils [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.259 182729 DEBUG oslo_concurrency.lockutils [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.259 182729 DEBUG nova.compute.manager [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] No waiting events found dispatching network-vif-unplugged-24c4cb21-9bec-4310-af64-2396a78ebb7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.260 182729 DEBUG nova.compute.manager [req-a670c744-14eb-458d-9495-8231beee9030 req-79358ce3-ec31-411f-87c5-db6ef5bd7cc5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-unplugged-24c4cb21-9bec-4310-af64-2396a78ebb7f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:21:03 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [ALERT]    (215452) : Current worker (215461) exited with code 143 (Terminated)
Jan 22 22:21:03 compute-0 neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d[215438]: [WARNING]  (215452) : All workers exited. Exiting... (0)
Jan 22 22:21:03 compute-0 systemd[1]: libpod-fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c.scope: Deactivated successfully.
Jan 22 22:21:03 compute-0 podman[215618]: 2026-01-22 22:21:03.272264304 +0000 UTC m=+0.066868614 container died fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.298 182729 INFO nova.virt.libvirt.driver [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Instance destroyed successfully.
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.300 182729 DEBUG nova.objects.instance [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lazy-loading 'resources' on Instance uuid 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c-userdata-shm.mount: Deactivated successfully.
Jan 22 22:21:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-408cb713207cb90a4ad88fd48965a220ec90207688e50b730b71fbca0bfb1e9a-merged.mount: Deactivated successfully.
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.334 182729 DEBUG nova.virt.libvirt.vif [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:20:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-278163831',display_name='tempest-FloatingIPsAssociationTestJSON-server-278163831',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-278163831',id=32,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:20:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='25f7fcbe33ce4b5fb686827d79a71058',ramdisk_id='',reservation_id='r-dhf2apm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1815924143',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1815924143-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:20:41Z,user_data=None,user_id='7de9ef653dde4c0e8525c15bc52dc809',uuid=4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.335 182729 DEBUG nova.network.os_vif_util [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converting VIF {"id": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "address": "fa:16:3e:95:28:0e", "network": {"id": "558b365e-7816-4d24-b8e7-6a29cc71889d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-839918057-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25f7fcbe33ce4b5fb686827d79a71058", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap24c4cb21-9b", "ovs_interfaceid": "24c4cb21-9bec-4310-af64-2396a78ebb7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.336 182729 DEBUG nova.network.os_vif_util [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.336 182729 DEBUG os_vif [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:21:03 compute-0 podman[215618]: 2026-01-22 22:21:03.337744833 +0000 UTC m=+0.132349103 container cleanup fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.339 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c4cb21-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.346 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.350 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:21:03 compute-0 systemd[1]: libpod-conmon-fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c.scope: Deactivated successfully.
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.354 182729 INFO os_vif [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:28:0e,bridge_name='br-int',has_traffic_filtering=True,id=24c4cb21-9bec-4310-af64-2396a78ebb7f,network=Network(558b365e-7816-4d24-b8e7-6a29cc71889d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap24c4cb21-9b')
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.355 182729 INFO nova.virt.libvirt.driver [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Deleting instance files /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9_del
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.356 182729 INFO nova.virt.libvirt.driver [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Deletion of /var/lib/nova/instances/4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9_del complete
Jan 22 22:21:03 compute-0 podman[215663]: 2026-01-22 22:21:03.421877193 +0000 UTC m=+0.049792932 container remove fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.429 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[28eee73f-05a8-4502-b3ed-d4072b53de99]: (4, ('Thu Jan 22 10:21:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d (fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c)\nfb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c\nThu Jan 22 10:21:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d (fb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c)\nfb873f430af32afa50d83e0d4f2d9869336e6edc24fb9c4e3856a1337315fc3c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.431 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[483a2733-11cb-424e-a363-15adef0f7371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.433 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap558b365e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:21:03 compute-0 kernel: tap558b365e-70: left promiscuous mode
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.445 182729 INFO nova.compute.manager [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.446 182729 DEBUG oslo.service.loopingcall [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.446 182729 DEBUG nova.compute.manager [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.447 182729 DEBUG nova.network.neutron [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.452 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.454 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7732a868-671f-4c27-945f-cbdcb7cff38a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.467 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7cf1b2-dbbc-4cd4-8c0a-fbe0b17a3ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.468 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ab32d2-5acb-46a9-88b4-d51a62928910]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.493 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[59ad3537-1b59-4dd0-bdfe-1647a0444458]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 407904, 'reachable_time': 25786, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215676, 'error': None, 'target': 'ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d558b365e\x2d7816\x2d4d24\x2db8e7\x2d6a29cc71889d.mount: Deactivated successfully.
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.500 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-558b365e-7816-4d24-b8e7-6a29cc71889d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:21:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:03.501 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5d083899-fbc3-4e36-81b1-47a5b9072f73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:21:03 compute-0 nova_compute[182725]: 2026-01-22 22:21:03.589 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.063 182729 DEBUG nova.network.neutron [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.083 182729 INFO nova.compute.manager [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Took 0.64 seconds to deallocate network for instance.
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.193 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.194 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.317 182729 DEBUG nova.compute.provider_tree [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.337 182729 DEBUG nova.scheduler.client.report [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.360 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.388 182729 INFO nova.scheduler.client.report [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Deleted allocations for instance 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.473 182729 DEBUG oslo_concurrency.lockutils [None req-b0ca46fe-b2ed-418b-84c3-0d0040223780 7de9ef653dde4c0e8525c15bc52dc809 25f7fcbe33ce4b5fb686827d79a71058 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:04 compute-0 nova_compute[182725]: 2026-01-22 22:21:04.494 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.274 182729 DEBUG nova.compute.manager [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.327 182729 INFO nova.compute.manager [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] instance snapshotting
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.449 182729 DEBUG nova.compute.manager [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.449 182729 DEBUG oslo_concurrency.lockutils [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.450 182729 DEBUG oslo_concurrency.lockutils [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.451 182729 DEBUG oslo_concurrency.lockutils [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.451 182729 DEBUG nova.compute.manager [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] No waiting events found dispatching network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.451 182729 WARNING nova.compute.manager [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received unexpected event network-vif-plugged-24c4cb21-9bec-4310-af64-2396a78ebb7f for instance with vm_state deleted and task_state None.
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.452 182729 DEBUG nova.compute.manager [req-75b66727-1f0e-4ccc-af7c-d92fa134461a req-6ee8b04f-1418-4059-b376-3b91db1ffcde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Received event network-vif-deleted-24c4cb21-9bec-4310-af64-2396a78ebb7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:21:05 compute-0 nova_compute[182725]: 2026-01-22 22:21:05.615 182729 INFO nova.virt.libvirt.driver [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Beginning live snapshot process
Jan 22 22:21:06 compute-0 virtqemud[182297]: invalid argument: disk vda does not have an active block job
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.033 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.128 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.129 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.221 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.249 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.321 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.323 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.371 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451.delta 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.373 182729 INFO nova.virt.libvirt.driver [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.437 182729 DEBUG nova.virt.libvirt.guest [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.942 182729 DEBUG nova.virt.libvirt.guest [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:21:06 compute-0 nova_compute[182725]: 2026-01-22 22:21:06.947 182729 INFO nova.virt.libvirt.driver [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 22:21:07 compute-0 nova_compute[182725]: 2026-01-22 22:21:07.010 182729 DEBUG nova.privsep.utils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:21:07 compute-0 nova_compute[182725]: 2026-01-22 22:21:07.011 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451.delta /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:07 compute-0 nova_compute[182725]: 2026-01-22 22:21:07.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:07 compute-0 nova_compute[182725]: 2026-01-22 22:21:07.455 182729 DEBUG oslo_concurrency.processutils [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451.delta /var/lib/nova/instances/snapshots/tmpj3myjwd6/1304025fc1eb4ed9a0f81f512e19d451" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:07 compute-0 nova_compute[182725]: 2026-01-22 22:21:07.466 182729 INFO nova.virt.libvirt.driver [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Snapshot extracted, beginning image upload
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.348 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.594 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.912 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.913 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.939 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.940 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.941 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:08 compute-0 nova_compute[182725]: 2026-01-22 22:21:08.941 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.018 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.096 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.098 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.159 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.424 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.426 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5476MB free_disk=73.29980087280273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.426 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.427 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.520 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 2356e839-a7e6-49f6-b657-19f93f2ad1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.521 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.521 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:21:09 compute-0 nova_compute[182725]: 2026-01-22 22:21:09.582 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:21:10 compute-0 podman[215736]: 2026-01-22 22:21:10.152753185 +0000 UTC m=+0.074778910 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:21:10 compute-0 podman[215737]: 2026-01-22 22:21:10.164318951 +0000 UTC m=+0.086044539 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:21:10 compute-0 nova_compute[182725]: 2026-01-22 22:21:10.522 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:21:10 compute-0 nova_compute[182725]: 2026-01-22 22:21:10.550 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:21:10 compute-0 nova_compute[182725]: 2026-01-22 22:21:10.551 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.096 182729 INFO nova.virt.libvirt.driver [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Snapshot image upload complete
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.097 182729 INFO nova.compute.manager [None req-c36d51f1-fbc3-4698-868f-e4fe7a8aa738 cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Took 5.76 seconds to snapshot the instance on the hypervisor.
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.526 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.527 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.527 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.528 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:21:11 compute-0 nova_compute[182725]: 2026-01-22 22:21:11.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:12.429 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:12.430 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:12.430 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:13 compute-0 podman[215779]: 2026-01-22 22:21:13.165833885 +0000 UTC m=+0.083951426 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:21:13 compute-0 nova_compute[182725]: 2026-01-22 22:21:13.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:13 compute-0 nova_compute[182725]: 2026-01-22 22:21:13.597 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:13 compute-0 nova_compute[182725]: 2026-01-22 22:21:13.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:13 compute-0 nova_compute[182725]: 2026-01-22 22:21:13.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:14 compute-0 nova_compute[182725]: 2026-01-22 22:21:14.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:21:18 compute-0 nova_compute[182725]: 2026-01-22 22:21:18.297 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120463.295556, 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:18 compute-0 nova_compute[182725]: 2026-01-22 22:21:18.298 182729 INFO nova.compute.manager [-] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] VM Stopped (Lifecycle Event)
Jan 22 22:21:18 compute-0 nova_compute[182725]: 2026-01-22 22:21:18.340 182729 DEBUG nova.compute.manager [None req-85f08bdd-fe5c-40dc-81c5-954a20ffd7b3 - - - - - -] [instance: 4583e9fe-ae40-48c4-b9e9-5baaf8fa76f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:18 compute-0 nova_compute[182725]: 2026-01-22 22:21:18.355 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:18 compute-0 nova_compute[182725]: 2026-01-22 22:21:18.599 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:20 compute-0 nova_compute[182725]: 2026-01-22 22:21:20.491 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:20 compute-0 nova_compute[182725]: 2026-01-22 22:21:20.593 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.508 182729 DEBUG nova.compute.manager [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.620 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.621 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.660 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'pci_requests' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.680 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.681 182729 INFO nova.compute.claims [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.681 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'resources' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.695 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'numa_topology' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.709 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'pci_devices' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.771 182729 INFO nova.compute.resource_tracker [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating resource usage from migration 4ec7d364-40db-44a4-8d88-b751eac5fa23
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.771 182729 DEBUG nova.compute.resource_tracker [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Starting to track incoming migration 4ec7d364-40db-44a4-8d88-b751eac5fa23 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.874 182729 DEBUG nova.compute.provider_tree [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.890 182729 DEBUG nova.scheduler.client.report [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.911 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:22 compute-0 nova_compute[182725]: 2026-01-22 22:21:22.912 182729 INFO nova.compute.manager [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Migrating
Jan 22 22:21:23 compute-0 nova_compute[182725]: 2026-01-22 22:21:23.357 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:23 compute-0 nova_compute[182725]: 2026-01-22 22:21:23.601 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:24 compute-0 sshd-session[215804]: Accepted publickey for nova from 192.168.122.102 port 38776 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:21:24 compute-0 systemd-logind[801]: New session 34 of user nova.
Jan 22 22:21:24 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:21:24 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:21:24 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:21:24 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:21:24 compute-0 systemd[215808]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:24 compute-0 systemd[215808]: Queued start job for default target Main User Target.
Jan 22 22:21:24 compute-0 systemd[215808]: Created slice User Application Slice.
Jan 22 22:21:24 compute-0 systemd[215808]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:21:24 compute-0 systemd[215808]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:21:24 compute-0 systemd[215808]: Reached target Paths.
Jan 22 22:21:24 compute-0 systemd[215808]: Reached target Timers.
Jan 22 22:21:24 compute-0 systemd[215808]: Starting D-Bus User Message Bus Socket...
Jan 22 22:21:24 compute-0 systemd[215808]: Starting Create User's Volatile Files and Directories...
Jan 22 22:21:24 compute-0 systemd[215808]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:21:24 compute-0 systemd[215808]: Reached target Sockets.
Jan 22 22:21:24 compute-0 systemd[215808]: Finished Create User's Volatile Files and Directories.
Jan 22 22:21:24 compute-0 systemd[215808]: Reached target Basic System.
Jan 22 22:21:24 compute-0 systemd[215808]: Reached target Main User Target.
Jan 22 22:21:24 compute-0 systemd[215808]: Startup finished in 148ms.
Jan 22 22:21:24 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:21:24 compute-0 systemd[1]: Started Session 34 of User nova.
Jan 22 22:21:24 compute-0 sshd-session[215804]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:24 compute-0 sshd-session[215823]: Received disconnect from 192.168.122.102 port 38776:11: disconnected by user
Jan 22 22:21:24 compute-0 sshd-session[215823]: Disconnected from user nova 192.168.122.102 port 38776
Jan 22 22:21:24 compute-0 sshd-session[215804]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:21:24 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Jan 22 22:21:24 compute-0 systemd-logind[801]: Session 34 logged out. Waiting for processes to exit.
Jan 22 22:21:24 compute-0 systemd-logind[801]: Removed session 34.
Jan 22 22:21:24 compute-0 sshd-session[215825]: Accepted publickey for nova from 192.168.122.102 port 38778 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:21:24 compute-0 systemd-logind[801]: New session 36 of user nova.
Jan 22 22:21:24 compute-0 systemd[1]: Started Session 36 of User nova.
Jan 22 22:21:24 compute-0 sshd-session[215825]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:24 compute-0 sshd-session[215828]: Received disconnect from 192.168.122.102 port 38778:11: disconnected by user
Jan 22 22:21:24 compute-0 sshd-session[215828]: Disconnected from user nova 192.168.122.102 port 38778
Jan 22 22:21:24 compute-0 sshd-session[215825]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:21:24 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Jan 22 22:21:24 compute-0 systemd-logind[801]: Session 36 logged out. Waiting for processes to exit.
Jan 22 22:21:24 compute-0 systemd-logind[801]: Removed session 36.
Jan 22 22:21:27 compute-0 podman[215830]: 2026-01-22 22:21:27.178138412 +0000 UTC m=+0.097944863 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:21:28 compute-0 nova_compute[182725]: 2026-01-22 22:21:28.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:28 compute-0 nova_compute[182725]: 2026-01-22 22:21:28.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:30 compute-0 podman[215851]: 2026-01-22 22:21:30.187969161 +0000 UTC m=+0.098256340 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:21:30 compute-0 podman[215850]: 2026-01-22 22:21:30.228600236 +0000 UTC m=+0.149349394 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.059 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.060 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.061 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.062 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.062 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.079 182729 INFO nova.compute.manager [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Terminating instance
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.096 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "refresh_cache-2356e839-a7e6-49f6-b657-19f93f2ad1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.096 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquired lock "refresh_cache-2356e839-a7e6-49f6-b657-19f93f2ad1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.097 182729 DEBUG nova.network.neutron [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.313 182729 DEBUG nova.network.neutron [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:21:31 compute-0 nova_compute[182725]: 2026-01-22 22:21:31.995 182729 DEBUG nova.network.neutron [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.028 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Releasing lock "refresh_cache-2356e839-a7e6-49f6-b657-19f93f2ad1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.028 182729 DEBUG nova.compute.manager [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:21:32 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 22 22:21:32 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Consumed 13.227s CPU time.
Jan 22 22:21:32 compute-0 systemd-machined[154006]: Machine qemu-15-instance-00000023 terminated.
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.308 182729 INFO nova.virt.libvirt.driver [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance destroyed successfully.
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.309 182729 DEBUG nova.objects.instance [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lazy-loading 'resources' on Instance uuid 2356e839-a7e6-49f6-b657-19f93f2ad1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.321 182729 INFO nova.virt.libvirt.driver [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Deleting instance files /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0_del
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.323 182729 INFO nova.virt.libvirt.driver [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Deletion of /var/lib/nova/instances/2356e839-a7e6-49f6-b657-19f93f2ad1f0_del complete
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.424 182729 INFO nova.compute.manager [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.425 182729 DEBUG oslo.service.loopingcall [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.425 182729 DEBUG nova.compute.manager [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.426 182729 DEBUG nova.network.neutron [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.534 182729 DEBUG nova.network.neutron [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.549 182729 DEBUG nova.network.neutron [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.564 182729 INFO nova.compute.manager [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Took 0.14 seconds to deallocate network for instance.
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.630 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.631 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.722 182729 DEBUG nova.compute.provider_tree [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.753 182729 DEBUG nova.scheduler.client.report [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.791 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.867 182729 INFO nova.scheduler.client.report [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Deleted allocations for instance 2356e839-a7e6-49f6-b657-19f93f2ad1f0
Jan 22 22:21:32 compute-0 nova_compute[182725]: 2026-01-22 22:21:32.954 182729 DEBUG oslo_concurrency.lockutils [None req-4d0f7e38-f996-4bfc-bc29-b1a266b5591c cffbe73f5e07412b97260458ca70b2b4 9b876429c3fd4b08951c6822abfb5eb1 - - default default] Lock "2356e839-a7e6-49f6-b657-19f93f2ad1f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:33 compute-0 nova_compute[182725]: 2026-01-22 22:21:33.363 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:33 compute-0 nova_compute[182725]: 2026-01-22 22:21:33.605 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:34 compute-0 nova_compute[182725]: 2026-01-22 22:21:34.217 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:34.217 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:21:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:34.219 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:21:35 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:21:35 compute-0 systemd[215808]: Activating special unit Exit the Session...
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped target Main User Target.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped target Basic System.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped target Paths.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped target Sockets.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped target Timers.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:21:35 compute-0 systemd[215808]: Closed D-Bus User Message Bus Socket.
Jan 22 22:21:35 compute-0 systemd[215808]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:21:35 compute-0 systemd[215808]: Removed slice User Application Slice.
Jan 22 22:21:35 compute-0 systemd[215808]: Reached target Shutdown.
Jan 22 22:21:35 compute-0 systemd[215808]: Finished Exit the Session.
Jan 22 22:21:35 compute-0 systemd[215808]: Reached target Exit the Session.
Jan 22 22:21:35 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:21:35 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:21:35 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:21:35 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:21:35 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:21:35 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:21:35 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:21:36 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 22:21:38 compute-0 sshd-session[215908]: Accepted publickey for nova from 192.168.122.102 port 50500 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:21:38 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:21:38 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:21:38 compute-0 systemd-logind[801]: New session 37 of user nova.
Jan 22 22:21:38 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:21:38 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:21:38 compute-0 systemd[215912]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:38 compute-0 nova_compute[182725]: 2026-01-22 22:21:38.365 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:38 compute-0 systemd[215912]: Queued start job for default target Main User Target.
Jan 22 22:21:38 compute-0 systemd[215912]: Created slice User Application Slice.
Jan 22 22:21:38 compute-0 systemd[215912]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:21:38 compute-0 systemd[215912]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:21:38 compute-0 systemd[215912]: Reached target Paths.
Jan 22 22:21:38 compute-0 systemd[215912]: Reached target Timers.
Jan 22 22:21:38 compute-0 systemd[215912]: Starting D-Bus User Message Bus Socket...
Jan 22 22:21:38 compute-0 systemd[215912]: Starting Create User's Volatile Files and Directories...
Jan 22 22:21:38 compute-0 systemd[215912]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:21:38 compute-0 systemd[215912]: Reached target Sockets.
Jan 22 22:21:38 compute-0 systemd[215912]: Finished Create User's Volatile Files and Directories.
Jan 22 22:21:38 compute-0 systemd[215912]: Reached target Basic System.
Jan 22 22:21:38 compute-0 systemd[215912]: Reached target Main User Target.
Jan 22 22:21:38 compute-0 systemd[215912]: Startup finished in 168ms.
Jan 22 22:21:38 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:21:38 compute-0 systemd[1]: Started Session 37 of User nova.
Jan 22 22:21:38 compute-0 sshd-session[215908]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:38 compute-0 nova_compute[182725]: 2026-01-22 22:21:38.607 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:39 compute-0 sshd-session[215927]: Received disconnect from 192.168.122.102 port 50500:11: disconnected by user
Jan 22 22:21:39 compute-0 sshd-session[215927]: Disconnected from user nova 192.168.122.102 port 50500
Jan 22 22:21:39 compute-0 sshd-session[215908]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:21:39 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Session 37 logged out. Waiting for processes to exit.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Removed session 37.
Jan 22 22:21:39 compute-0 sshd-session[215929]: Accepted publickey for nova from 192.168.122.102 port 50504 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:21:39 compute-0 systemd-logind[801]: New session 39 of user nova.
Jan 22 22:21:39 compute-0 systemd[1]: Started Session 39 of User nova.
Jan 22 22:21:39 compute-0 sshd-session[215929]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:39 compute-0 sshd-session[215932]: Received disconnect from 192.168.122.102 port 50504:11: disconnected by user
Jan 22 22:21:39 compute-0 sshd-session[215932]: Disconnected from user nova 192.168.122.102 port 50504
Jan 22 22:21:39 compute-0 sshd-session[215929]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:21:39 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Session 39 logged out. Waiting for processes to exit.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Removed session 39.
Jan 22 22:21:39 compute-0 sshd-session[215934]: Accepted publickey for nova from 192.168.122.102 port 50510 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:21:39 compute-0 systemd-logind[801]: New session 40 of user nova.
Jan 22 22:21:39 compute-0 systemd[1]: Started Session 40 of User nova.
Jan 22 22:21:39 compute-0 sshd-session[215934]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:21:39 compute-0 sshd-session[215937]: Received disconnect from 192.168.122.102 port 50510:11: disconnected by user
Jan 22 22:21:39 compute-0 sshd-session[215937]: Disconnected from user nova 192.168.122.102 port 50510
Jan 22 22:21:39 compute-0 sshd-session[215934]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:21:39 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Session 40 logged out. Waiting for processes to exit.
Jan 22 22:21:39 compute-0 systemd-logind[801]: Removed session 40.
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.001 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.003 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquired lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.003 182729 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.272 182729 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.593 182729 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.615 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Releasing lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.775 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.776 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.777 182729 INFO nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Creating image(s)
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.777 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.807 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.894 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.895 182729 DEBUG nova.virt.disk.api [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Checking if we can resize image /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.895 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.963 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.964 182729 DEBUG nova.virt.disk.api [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Cannot resize image /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.984 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.984 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Ensure instance console log exists: /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.985 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.985 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.985 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.987 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:21:40 compute-0 nova_compute[182725]: 2026-01-22 22:21:40.994 182729 WARNING nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.000 182729 DEBUG nova.virt.libvirt.host [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.000 182729 DEBUG nova.virt.libvirt.host [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.006 182729 DEBUG nova.virt.libvirt.host [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.006 182729 DEBUG nova.virt.libvirt.host [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.007 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.008 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.008 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.008 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.008 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.009 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.009 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.009 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.009 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.010 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.010 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.010 182729 DEBUG nova.virt.hardware [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.010 182729 DEBUG nova.objects.instance [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.025 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.089 182729 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.090 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.090 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.091 182729 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.093 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <uuid>79166459-7b8b-44ed-8dba-0ba4cb9d97ff</uuid>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <name>instance-00000025</name>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:name>tempest-MigrationsAdminTest-server-1393496925</nova:name>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:21:40</nova:creationTime>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 22:21:41 compute-0 nova_compute[182725]:         <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <system>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="serial">79166459-7b8b-44ed-8dba-0ba4cb9d97ff</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="uuid">79166459-7b8b-44ed-8dba-0ba4cb9d97ff</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </system>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <os>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </os>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <features>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </features>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/console.log" append="off"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <video>
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </video>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:21:41 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:21:41 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:21:41 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:21:41 compute-0 nova_compute[182725]: </domain>
Jan 22 22:21:41 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:21:41 compute-0 podman[215947]: 2026-01-22 22:21:41.135041748 +0000 UTC m=+0.066151906 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 22:21:41 compute-0 podman[215948]: 2026-01-22 22:21:41.135769476 +0000 UTC m=+0.065259234 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.156 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.157 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.157 182729 INFO nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Using config drive
Jan 22 22:21:41 compute-0 systemd-machined[154006]: New machine qemu-16-instance-00000025.
Jan 22 22:21:41 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000025.
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.525 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120501.5246832, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.525 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Resumed (Lifecycle Event)
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.527 182729 DEBUG nova.compute.manager [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.532 182729 INFO nova.virt.libvirt.driver [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance running successfully.
Jan 22 22:21:41 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.534 182729 DEBUG nova.virt.libvirt.guest [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.535 182729 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.556 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.560 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.578 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.579 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120501.5249379, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.579 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Started (Lifecycle Event)
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.627 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:41 compute-0 nova_compute[182725]: 2026-01-22 22:21:41.633 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:21:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:21:43.221 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:21:43 compute-0 nova_compute[182725]: 2026-01-22 22:21:43.369 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:43 compute-0 nova_compute[182725]: 2026-01-22 22:21:43.608 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:44 compute-0 podman[216015]: 2026-01-22 22:21:44.141127966 +0000 UTC m=+0.067545951 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:21:47 compute-0 nova_compute[182725]: 2026-01-22 22:21:47.306 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120492.3055532, 2356e839-a7e6-49f6-b657-19f93f2ad1f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:47 compute-0 nova_compute[182725]: 2026-01-22 22:21:47.310 182729 INFO nova.compute.manager [-] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] VM Stopped (Lifecycle Event)
Jan 22 22:21:47 compute-0 nova_compute[182725]: 2026-01-22 22:21:47.330 182729 DEBUG nova.compute.manager [None req-2e099055-92d7-40fa-ae71-a16bd79d9295 - - - - - -] [instance: 2356e839-a7e6-49f6-b657-19f93f2ad1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:48 compute-0 nova_compute[182725]: 2026-01-22 22:21:48.371 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:48 compute-0 nova_compute[182725]: 2026-01-22 22:21:48.610 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:49 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:21:49 compute-0 systemd[215912]: Activating special unit Exit the Session...
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped target Main User Target.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped target Basic System.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped target Paths.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped target Sockets.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped target Timers.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:21:49 compute-0 systemd[215912]: Closed D-Bus User Message Bus Socket.
Jan 22 22:21:49 compute-0 systemd[215912]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:21:49 compute-0 systemd[215912]: Removed slice User Application Slice.
Jan 22 22:21:49 compute-0 systemd[215912]: Reached target Shutdown.
Jan 22 22:21:49 compute-0 systemd[215912]: Finished Exit the Session.
Jan 22 22:21:49 compute-0 systemd[215912]: Reached target Exit the Session.
Jan 22 22:21:49 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:21:49 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:21:49 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:21:49 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:21:49 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:21:49 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:21:49 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.109 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.113 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.138 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.238 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.239 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.246 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.247 182729 INFO nova.compute.claims [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.392 182729 DEBUG nova.compute.provider_tree [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.410 182729 DEBUG nova.scheduler.client.report [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.431 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.432 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.475 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.476 182729 DEBUG nova.network.neutron [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.493 182729 INFO nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.514 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.625 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.627 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.627 182729 INFO nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Creating image(s)
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.628 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.628 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.628 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.641 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.701 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.704 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.705 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.715 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.775 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.776 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.816 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.817 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.818 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.882 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.884 182729 DEBUG nova.virt.disk.api [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Checking if we can resize image /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.884 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.955 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.958 182729 DEBUG nova.virt.disk.api [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Cannot resize image /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.959 182729 DEBUG nova.objects.instance [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.965 182729 DEBUG nova.network.neutron [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.965 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.980 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.981 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Ensure instance console log exists: /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.981 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.982 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.982 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.983 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.988 182729 WARNING nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.994 182729 DEBUG nova.virt.libvirt.host [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:21:50 compute-0 nova_compute[182725]: 2026-01-22 22:21:50.995 182729 DEBUG nova.virt.libvirt.host [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.001 182729 DEBUG nova.virt.libvirt.host [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.002 182729 DEBUG nova.virt.libvirt.host [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.005 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.005 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.006 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.006 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.007 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.007 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.008 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.008 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.008 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.009 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.009 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.010 182729 DEBUG nova.virt.hardware [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.016 182729 DEBUG nova.objects.instance [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_devices' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.031 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <uuid>ce913c81-c8b7-4b71-91b0-ec941d59dc1c</uuid>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <name>instance-00000027</name>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:name>tempest-MigrationsAdminTest-server-1151892985</nova:name>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:21:50</nova:creationTime>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 22:21:51 compute-0 nova_compute[182725]:         <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <system>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="serial">ce913c81-c8b7-4b71-91b0-ec941d59dc1c</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="uuid">ce913c81-c8b7-4b71-91b0-ec941d59dc1c</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </system>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <os>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </os>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <features>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </features>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/console.log" append="off"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <video>
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </video>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:21:51 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:21:51 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:21:51 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:21:51 compute-0 nova_compute[182725]: </domain>
Jan 22 22:21:51 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.081 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.082 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.083 182729 INFO nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Using config drive
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.364 182729 INFO nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Creating config drive at /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.370 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ee7qt8_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.501 182729 DEBUG oslo_concurrency.processutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ee7qt8_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:51 compute-0 systemd-machined[154006]: New machine qemu-17-instance-00000027.
Jan 22 22:21:51 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000027.
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.896 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120511.8963351, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.899 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Resumed (Lifecycle Event)
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.904 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.905 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.910 182729 INFO nova.virt.libvirt.driver [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance spawned successfully.
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.911 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.968 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.974 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.977 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.977 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.978 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.978 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.979 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:51 compute-0 nova_compute[182725]: 2026-01-22 22:21:51.979 182729 DEBUG nova.virt.libvirt.driver [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.002 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.002 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120511.9038336, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.002 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Started (Lifecycle Event)
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.034 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.039 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.053 182729 INFO nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Took 1.43 seconds to spawn the instance on the hypervisor.
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.053 182729 DEBUG nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.061 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.163 182729 INFO nova.compute.manager [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Took 1.96 seconds to build instance.
Jan 22 22:21:52 compute-0 nova_compute[182725]: 2026-01-22 22:21:52.187 182729 DEBUG oslo_concurrency.lockutils [None req-e7960a5b-3e5f-4b62-82bf-4deea948c818 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:21:53 compute-0 nova_compute[182725]: 2026-01-22 22:21:53.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:53 compute-0 nova_compute[182725]: 2026-01-22 22:21:53.612 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.089 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.090 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.090 182729 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.219 182729 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.576 182729 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.596 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.883 182729 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.884 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Creating file /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/c6ea4449f44a45e1bf00efd6b3aaf831.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 22:21:55 compute-0 nova_compute[182725]: 2026-01-22 22:21:55.885 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/c6ea4449f44a45e1bf00efd6b3aaf831.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.370 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/c6ea4449f44a45e1bf00efd6b3aaf831.tmp" returned: 1 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.372 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/c6ea4449f44a45e1bf00efd6b3aaf831.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.373 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Creating directory /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.373 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.615 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:21:56 compute-0 nova_compute[182725]: 2026-01-22 22:21:56.621 182729 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:21:58 compute-0 podman[216087]: 2026-01-22 22:21:58.166462374 +0000 UTC m=+0.089283709 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 22:21:58 compute-0 nova_compute[182725]: 2026-01-22 22:21:58.381 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:58 compute-0 nova_compute[182725]: 2026-01-22 22:21:58.615 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:21:59 compute-0 ovn_controller[94850]: 2026-01-22T22:21:59Z|00121|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 22:22:01 compute-0 podman[216106]: 2026-01-22 22:22:01.150721972 +0000 UTC m=+0.066383842 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 22:22:01 compute-0 podman[216105]: 2026-01-22 22:22:01.174128641 +0000 UTC m=+0.092763214 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 22 22:22:03 compute-0 nova_compute[182725]: 2026-01-22 22:22:03.385 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:03 compute-0 nova_compute[182725]: 2026-01-22 22:22:03.619 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:06 compute-0 nova_compute[182725]: 2026-01-22 22:22:06.685 182729 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:22:08 compute-0 nova_compute[182725]: 2026-01-22 22:22:08.388 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:08 compute-0 nova_compute[182725]: 2026-01-22 22:22:08.621 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:08 compute-0 nova_compute[182725]: 2026-01-22 22:22:08.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:08 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 22 22:22:08 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000027.scope: Consumed 12.248s CPU time.
Jan 22 22:22:08 compute-0 systemd-machined[154006]: Machine qemu-17-instance-00000027 terminated.
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.107 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'name': 'tempest-MigrationsAdminTest-server-1393496925', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'hostId': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.109 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ce913c81-c8b7-4b71-91b0-ec941d59dc1c', 'name': 'tempest-MigrationsAdminTest-server-1151892985', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'hostId': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.133 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.bytes volume: 32016384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.134 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.136 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20ec0dc9-af22-49fb-8af1-0f0914af6874', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32016384, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.109970', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca8b3e96-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '4b894cd5bf56aabbc4d700621644ecab5006cca6fd0d56bd3574ef95be4f87a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 
'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.109970', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca8b4e36-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '871dcc6a1dcb820fec8bfaf19a92e958e037ddad934dba48cecab12470098b70'}]}, 'timestamp': '2026-01-22 22:22:09.136322', '_unique_id': '4be5e11eb5db4a1393f06e2e43ad80dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.138 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.142 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.142 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.bytes volume: 249856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.142 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.143 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ddbf55-f7fd-40db-a6be-5e9078a4cf13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 249856, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.142336', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca8c7f40-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '00efb25ee9a82530ae0caf10b775739f2e2d5c32629218689b76fcd7bfdf5c0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 
'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.142336', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca8c8ca6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': 'fcb12c11f0452923390c6baa8014639d70bca38918679aa8a425808ee3b825a7'}]}, 'timestamp': '2026-01-22 22:22:09.143618', '_unique_id': 'c8b1b9f569c5447c98b49035a51a397d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.144 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.145 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.160 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/memory.usage volume: 40.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.161 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d5b6904-1c02-419c-9621-f24b88988ce8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.6328125, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'timestamp': '2026-01-22T22:22:09.145654', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ca8f51de-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.817797643, 'message_signature': '64f5664190d5983155fdc57eaa4ef5118508a389a95a2b1fcf6de9cf03b7372f'}]}, 'timestamp': '2026-01-22 22:22:09.162132', '_unique_id': '3eaedcf8a5c44ef5b9cb5d08986e3d31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.163 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.175 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.176 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.177 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e8dbb8d-d848-47d0-aafe-43333df54b4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.165592', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca9199da-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': '241f258322187f45a452fd4a4cf4c364edf119a951e68feefeeb6c6109d3a2da'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.165592', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca91a54c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': '80cd042f5205a956894dce764cf7259499fd284ee22fd959757b5e12318f36fd'}]}, 'timestamp': '2026-01-22 22:22:09.177253', '_unique_id': 'd40ef52713914f3dafe01fe47260abd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.179 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.179 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.180 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e084f1ca-77fb-4e0e-ade5-ff022dc0fd39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.179420', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca9225da-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': '38dc444d4a33a764da0d19b2a295d41f9414cd5b426e6d5804ab0ac12e5ed963'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.179420', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca922f1c-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': '2880cc892d300d23cc681409ea619c6e39a4c61621cf23197d7e97f3acadf8cc'}]}, 'timestamp': '2026-01-22 22:22:09.180596', '_unique_id': '071ddbadf65c49f4810da0e7703bbaa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.182 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>]
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.183 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.184 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c958fc4d-9dcd-4334-a915-d77a56e5c166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.183634', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca92c8c8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': 'ab333e8434f10e6427664a16214977cc073b9eebc999d5947536d407d96ea96f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 
'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.183634', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca92d2e6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '4cb4b5f0217746ca4f2306d05dfb7daa589eff502359352c1a2ca19dfb425c59'}]}, 'timestamp': '2026-01-22 22:22:09.184710', '_unique_id': 'c224e81b7b6b4a48bbcba731a5889df0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.186 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.186 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>]
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.186 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.latency volume: 23604582 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.186 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbb44b25-e33e-4d22-afd0-e5f394410168', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23604582, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.186263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca932eda-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '0f872127db195e9b62407fd2a35c1981f2cbb0ecf81ce1d8f851722f712b0a9e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 
'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.186263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca9336e6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': 'd914c31f84da8b6fda433e3acdb0afc0c7470564d302aa4854c7433ce7d8553a'}]}, 'timestamp': '2026-01-22 22:22:09.187218', '_unique_id': '48f5dbc51265494b811bea49b1fa1a5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.188 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.188 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/cpu volume: 11050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77859dc3-dead-495a-b7b3-e229efe07328', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11050000000, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'timestamp': '2026-01-22T22:22:09.188391', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca9383da-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.817797643, 'message_signature': '1ee1c1b0105ac0427729ede7130cec4adaebdbced6e64114bb4d7db042c00d7f'}]}, 'timestamp': '2026-01-22 22:22:09.189226', '_unique_id': '8937a9b9f52b41d7a54a503767bcea9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.190 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.191 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.191 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.191 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>]
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.192 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.192 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1393496925>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1151892985>]
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.193 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a154fe9b-c76f-4e7e-a993-0834b7c632f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.193451', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca9447e8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': 'f14cb70ccc2df9583a71178556239f1deab6bb608fce31767d1109edb94dc7a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': 
'79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.193451', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca9450b2-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.822854198, 'message_signature': '386ff93f7e4630833749d5754ad101bc650724bf4eacba894d8b241cbacca170'}]}, 'timestamp': '2026-01-22 22:22:09.194360', '_unique_id': 'd4859ac87fa941e582abfa004d9c2ee1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.196 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.196 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.197 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.197 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.latency volume: 241209835 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.197 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.latency volume: 19534718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.198 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '788f63e6-05cd-45c5-afce-62508c01e706', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 241209835, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.197536', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca94e7b6-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '52f903264f5f8aecb8ed13d803d0b2275c2bb5bc9814365e27c289883a4e5e60'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19534718, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 
'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.197536', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca94f0da-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '26dba832a8b89099585084d8e100985d571d9c8e92fa7315781734b386b17faa'}]}, 'timestamp': '2026-01-22 22:22:09.198509', '_unique_id': '09d87c5ee27948d0b6cb8b099bda1f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.requests volume: 1205 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.199 12 DEBUG ceilometer.compute.pollsters [-] 79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.200 12 DEBUG ceilometer.compute.pollsters [-] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000027, id=ce913c81-c8b7-4b71-91b0-ec941d59dc1c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3452b466-3c6e-4546-96f0-82a46192d1a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1205, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-vda', 'timestamp': '2026-01-22T22:22:09.199682', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca953bf8-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': 'd453a3b876e9807b81fede330d32e03589be4f03dcc1de229afcf8efec7d581f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'user_name': None, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'project_name': 
None, 'resource_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff-sda', 'timestamp': '2026-01-22T22:22:09.199682', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1393496925', 'name': 'instance-00000025', 'instance_id': '79166459-7b8b-44ed-8dba-0ba4cb9d97ff', 'instance_type': 'm1.nano', 'host': '2cdabe269c81f5232f1848be7e0b5b3cc6cecb5eacab26ca77c489d2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca9543f0-f7e0-11f0-9a35-fa163e3d8874', 'monotonic_time': 4166.767231213, 'message_signature': '68fa78efc46b4f58ba7c086e25eb143afaefa3825bf83dcb3ac616060a4b4036'}]}, 'timestamp': '2026-01-22 22:22:09.200587', '_unique_id': 'efb94292fca843d097b6074b00cdd021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:22:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:22:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.704 182729 INFO nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance shutdown successfully after 13 seconds.
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.710 182729 INFO nova.virt.libvirt.driver [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance destroyed successfully.
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.715 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.779 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.780 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.846 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.848 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Copying file /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk to 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.848 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:22:09 compute-0 nova_compute[182725]: 2026-01-22 22:22:09.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.718 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.719 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.719 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.720 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.751 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "scp -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk" returned: 0 in 0.903s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.752 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Copying file /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.752 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.config 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:10 compute-0 nova_compute[182725]: 2026-01-22 22:22:10.962 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.034 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "scp -C -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.config 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.034 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Copying file /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.035 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.info 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.240 182729 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "scp -C -r /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_resize/disk.info 192.168.122.102:/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.367 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.368 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.368 182729 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.820 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.842 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.843 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.843 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.844 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.844 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.877 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.878 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.878 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.879 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:22:11 compute-0 nova_compute[182725]: 2026-01-22 22:22:11.965 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:11 compute-0 podman[216191]: 2026-01-22 22:22:11.988481888 +0000 UTC m=+0.057350749 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 22:22:12 compute-0 podman[216192]: 2026-01-22 22:22:12.002927325 +0000 UTC m=+0.066698480 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.034 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.035 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.119 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.127 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000027, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.271 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.273 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5490MB free_disk=73.31986618041992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.273 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.273 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.314 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration for instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.334 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating resource usage from migration 90e6c5da-445b-4739-932f-03c7172db2c7
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.335 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Starting to track outgoing migration 90e6c5da-445b-4739-932f-03c7172db2c7 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.390 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 79166459-7b8b-44ed-8dba-0ba4cb9d97ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.390 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration 90e6c5da-445b-4739-932f-03c7172db2c7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.391 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.391 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:22:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:12.430 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:12.431 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:12.432 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.457 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.469 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.486 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.486 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.531 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:12 compute-0 nova_compute[182725]: 2026-01-22 22:22:12.531 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:22:13 compute-0 nova_compute[182725]: 2026-01-22 22:22:13.391 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:13 compute-0 nova_compute[182725]: 2026-01-22 22:22:13.623 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:13 compute-0 nova_compute[182725]: 2026-01-22 22:22:13.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.056 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.057 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.057 182729 DEBUG nova.compute.manager [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Going to confirm migration 8 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.131 182729 DEBUG nova.objects.instance [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'info_cache' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.849 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.849 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:22:14 compute-0 nova_compute[182725]: 2026-01-22 22:22:14.850 182729 DEBUG nova.network.neutron [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:22:15 compute-0 nova_compute[182725]: 2026-01-22 22:22:15.104 182729 DEBUG nova.network.neutron [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:22:15 compute-0 podman[216239]: 2026-01-22 22:22:15.161607694 +0000 UTC m=+0.090144040 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:22:15 compute-0 nova_compute[182725]: 2026-01-22 22:22:15.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.072 182729 DEBUG nova.network.neutron [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.097 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.098 182729 DEBUG nova.objects.instance [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.129 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.130 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.237 182729 DEBUG nova.compute.provider_tree [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.253 182729 DEBUG nova.scheduler.client.report [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.289 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.424 182729 INFO nova.scheduler.client.report [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Deleted allocation for migration 90e6c5da-445b-4739-932f-03c7172db2c7
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.498 182729 DEBUG oslo_concurrency.lockutils [None req-3f6ebc64-9733-42a1-8b0e-267c4d9a7d2d 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:16 compute-0 nova_compute[182725]: 2026-01-22 22:22:16.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:22:18 compute-0 nova_compute[182725]: 2026-01-22 22:22:18.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:18 compute-0 nova_compute[182725]: 2026-01-22 22:22:18.625 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:23 compute-0 nova_compute[182725]: 2026-01-22 22:22:23.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:23 compute-0 nova_compute[182725]: 2026-01-22 22:22:23.629 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:24 compute-0 nova_compute[182725]: 2026-01-22 22:22:24.104 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120529.1032856, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:22:24 compute-0 nova_compute[182725]: 2026-01-22 22:22:24.105 182729 INFO nova.compute.manager [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Stopped (Lifecycle Event)
Jan 22 22:22:24 compute-0 nova_compute[182725]: 2026-01-22 22:22:24.132 182729 DEBUG nova.compute.manager [None req-02b9cb8e-bc2d-4fd6-8a46-740a03f60fdd - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.536 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.537 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.552 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.661 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.662 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.672 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.673 182729 INFO nova.compute.claims [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.800 182729 DEBUG nova.compute.provider_tree [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.813 182729 DEBUG nova.scheduler.client.report [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.831 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.832 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.901 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.902 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.926 182729 INFO nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:22:26 compute-0 nova_compute[182725]: 2026-01-22 22:22:26.956 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.075 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.077 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.078 182729 INFO nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Creating image(s)
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.078 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.079 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.080 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.099 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.123 182729 DEBUG nova.policy [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7df5d821a5ca4c08abc23a1f0c71403a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8612432d2f8442ee9d0e924d1da859d7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.167 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.168 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.169 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.184 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.253 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.254 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.296 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.297 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.298 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.363 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.364 182729 DEBUG nova.virt.disk.api [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Checking if we can resize image /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.365 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.425 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.427 182729 DEBUG nova.virt.disk.api [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Cannot resize image /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.427 182729 DEBUG nova.objects.instance [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lazy-loading 'migration_context' on Instance uuid d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.450 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.451 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Ensure instance console log exists: /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.452 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.452 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:27 compute-0 nova_compute[182725]: 2026-01-22 22:22:27.453 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:28 compute-0 nova_compute[182725]: 2026-01-22 22:22:28.232 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Successfully created port: 149448f0-13a7-436b-b700-ada10405a161 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:22:28 compute-0 nova_compute[182725]: 2026-01-22 22:22:28.400 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:28 compute-0 nova_compute[182725]: 2026-01-22 22:22:28.632 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:29 compute-0 podman[216278]: 2026-01-22 22:22:29.149591977 +0000 UTC m=+0.073984300 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.314 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Successfully updated port: 149448f0-13a7-436b-b700-ada10405a161 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.349 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.349 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquired lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.350 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.511 182729 DEBUG nova.compute.manager [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-changed-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.511 182729 DEBUG nova.compute.manager [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Refreshing instance network info cache due to event network-changed-149448f0-13a7-436b-b700-ada10405a161. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.511 182729 DEBUG oslo_concurrency.lockutils [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:22:29 compute-0 nova_compute[182725]: 2026-01-22 22:22:29.527 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.582 182729 DEBUG nova.network.neutron [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updating instance_info_cache with network_info: [{"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.609 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Releasing lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.609 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Instance network_info: |[{"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.610 182729 DEBUG oslo_concurrency.lockutils [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.610 182729 DEBUG nova.network.neutron [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Refreshing network info cache for port 149448f0-13a7-436b-b700-ada10405a161 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.616 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Start _get_guest_xml network_info=[{"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.623 182729 WARNING nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.638 182729 DEBUG nova.virt.libvirt.host [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.639 182729 DEBUG nova.virt.libvirt.host [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.647 182729 DEBUG nova.virt.libvirt.host [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.647 182729 DEBUG nova.virt.libvirt.host [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.650 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.650 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.651 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.651 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.652 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.652 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.652 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.653 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.653 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.653 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.654 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.654 182729 DEBUG nova.virt.hardware [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.660 182729 DEBUG nova.virt.libvirt.vif [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=42,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXm2Sl4t9UMfiHNJWOITRNO0WQkLl/0gmoEZQLwptcYmG0W32tstvPhyqNMV3ZeC1It4ACWm/w2fPxk1aMvnTtBLxMmlV1UCKTEvTVfhX+2VhG8ORzpXFbxo4WEaDP4Sg==',key_name='tempest-keypair-875523264',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8612432d2f8442ee9d0e924d1da859d7',ramdisk_id='',reservation_id='r-wk3encuz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',ima
ge_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2006574538',owner_user_name='tempest-ServersTestFqdnHostnames-2006574538-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7df5d821a5ca4c08abc23a1f0c71403a',uuid=d7e18cec-eb96-4031-a96e-e8ebfe11d7f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.661 182729 DEBUG nova.network.os_vif_util [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converting VIF {"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.662 182729 DEBUG nova.network.os_vif_util [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.663 182729 DEBUG nova.objects.instance [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lazy-loading 'pci_devices' on Instance uuid d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.679 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <uuid>d7e18cec-eb96-4031-a96e-e8ebfe11d7f4</uuid>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <name>instance-0000002a</name>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:name>guest-instance-1.domain.com</nova:name>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:22:30</nova:creationTime>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:user uuid="7df5d821a5ca4c08abc23a1f0c71403a">tempest-ServersTestFqdnHostnames-2006574538-project-member</nova:user>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:project uuid="8612432d2f8442ee9d0e924d1da859d7">tempest-ServersTestFqdnHostnames-2006574538</nova:project>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         <nova:port uuid="149448f0-13a7-436b-b700-ada10405a161">
Jan 22 22:22:30 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <system>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="serial">d7e18cec-eb96-4031-a96e-e8ebfe11d7f4</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="uuid">d7e18cec-eb96-4031-a96e-e8ebfe11d7f4</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </system>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <os>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </os>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <features>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </features>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.config"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:dd:45:5e"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <target dev="tap149448f0-13"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/console.log" append="off"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <video>
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </video>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:22:30 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:22:30 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:22:30 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:22:30 compute-0 nova_compute[182725]: </domain>
Jan 22 22:22:30 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.680 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Preparing to wait for external event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.680 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.681 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.681 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.682 182729 DEBUG nova.virt.libvirt.vif [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=42,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXm2Sl4t9UMfiHNJWOITRNO0WQkLl/0gmoEZQLwptcYmG0W32tstvPhyqNMV3ZeC1It4ACWm/w2fPxk1aMvnTtBLxMmlV1UCKTEvTVfhX+2VhG8ORzpXFbxo4WEaDP4Sg==',key_name='tempest-keypair-875523264',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8612432d2f8442ee9d0e924d1da859d7',ramdisk_id='',reservation_id='r-wk3encuz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2006574538',owner_user_name='tempest-ServersTestFqdnHostnames-2006574538-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7df5d821a5ca4c08abc23a1f0c71403a',uuid=d7e18cec-eb96-4031-a96e-e8ebfe11d7f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.683 182729 DEBUG nova.network.os_vif_util [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converting VIF {"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.684 182729 DEBUG nova.network.os_vif_util [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.684 182729 DEBUG os_vif [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.685 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.686 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.686 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.691 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.692 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap149448f0-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.693 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap149448f0-13, col_values=(('external_ids', {'iface-id': '149448f0-13a7-436b-b700-ada10405a161', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:45:5e', 'vm-uuid': 'd7e18cec-eb96-4031-a96e-e8ebfe11d7f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.695 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:30 compute-0 NetworkManager[54954]: <info>  [1769120550.6970] manager: (tap149448f0-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.698 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.704 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.705 182729 INFO os_vif [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13')
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.756 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.756 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.757 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] No VIF found with MAC fa:16:3e:dd:45:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:22:30 compute-0 nova_compute[182725]: 2026-01-22 22:22:30.757 182729 INFO nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Using config drive
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.130 182729 INFO nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Creating config drive at /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.config
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.140 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps36j_5bs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.274 182729 DEBUG oslo_concurrency.processutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps36j_5bs" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:22:31 compute-0 kernel: tap149448f0-13: entered promiscuous mode
Jan 22 22:22:31 compute-0 ovn_controller[94850]: 2026-01-22T22:22:31Z|00122|binding|INFO|Claiming lport 149448f0-13a7-436b-b700-ada10405a161 for this chassis.
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_controller[94850]: 2026-01-22T22:22:31Z|00123|binding|INFO|149448f0-13a7-436b-b700-ada10405a161: Claiming fa:16:3e:dd:45:5e 10.100.0.5
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.3978] manager: (tap149448f0-13): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.406 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.424 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:45:5e 10.100.0.5'], port_security=['fa:16:3e:dd:45:5e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd7e18cec-eb96-4031-a96e-e8ebfe11d7f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-333f36c3-573e-4ed3-8a05-3f051147e707', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8612432d2f8442ee9d0e924d1da859d7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '90e37ac1-e6af-477a-8819-6c97c17632f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61cd2af8-2c5e-4786-a783-41f32dde76e7, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=149448f0-13a7-436b-b700-ada10405a161) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.425 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 149448f0-13a7-436b-b700-ada10405a161 in datapath 333f36c3-573e-4ed3-8a05-3f051147e707 bound to our chassis
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.426 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 333f36c3-573e-4ed3-8a05-3f051147e707
Jan 22 22:22:31 compute-0 systemd-udevd[216340]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.444 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7511caa0-08a9-41a4-9fa6-76f0f5f6b976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.445 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap333f36c3-51 in ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.449 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap333f36c3-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.449 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca6b166-e7f3-4913-b92a-65f143fd60e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.450 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d05284-e85e-413d-85dd-31ed902640ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 systemd-machined[154006]: New machine qemu-18-instance-0000002a.
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.462 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.464 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_controller[94850]: 2026-01-22T22:22:31Z|00124|binding|INFO|Setting lport 149448f0-13a7-436b-b700-ada10405a161 ovn-installed in OVS
Jan 22 22:22:31 compute-0 ovn_controller[94850]: 2026-01-22T22:22:31Z|00125|binding|INFO|Setting lport 149448f0-13a7-436b-b700-ada10405a161 up in Southbound
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.4683] device (tap149448f0-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.4690] device (tap149448f0-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.469 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1595ec9e-abf9-4b86-b8a1-9a82fbf329d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000002a.
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.489 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[06eb9e5b-5591-4011-b04c-4748ddc38c79]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 podman[216311]: 2026-01-22 22:22:31.504463215 +0000 UTC m=+0.126307884 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.532 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c4954560-8449-4a8a-8d35-2ed24218469b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 podman[216309]: 2026-01-22 22:22:31.536839335 +0000 UTC m=+0.162966950 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.5404] manager: (tap333f36c3-50): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Jan 22 22:22:31 compute-0 systemd-udevd[216353]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.540 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2e6b10-066d-4001-b078-8881cf6fbfdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.587 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[34737cbb-1df1-4b13-9afd-a99293160708]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.592 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9d714f23-99ca-4fd6-8be7-2ab01e910c0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.6236] device (tap333f36c3-50): carrier: link connected
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.630 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[44e264e8-30fb-44d4-b5a1-d46094f36f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.651 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a5525e2e-5004-44de-aa31-faa53adc8a04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap333f36c3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:b7:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418922, 'reachable_time': 34076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216394, 'error': None, 'target': 'ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.676 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a781c3-c130-4f1d-8208-ad84a2069c9f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:b7a0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418922, 'tstamp': 418922}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216397, 'error': None, 'target': 'ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.698 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[791d6f32-2f86-4f5c-ba30-0c7ea3a5e570]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap333f36c3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:b7:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418922, 'reachable_time': 34076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216401, 'error': None, 'target': 'ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.738 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1a523c21-c3ee-47ef-a8f9-fe10a8676907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.768 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120551.7663, d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.768 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] VM Started (Lifecycle Event)
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.803 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[132d7378-99b1-426e-9603-a5ce55745d7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.805 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap333f36c3-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.805 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.806 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap333f36c3-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:31 compute-0 NetworkManager[54954]: <info>  [1769120551.8091] manager: (tap333f36c3-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 22 22:22:31 compute-0 kernel: tap333f36c3-50: entered promiscuous mode
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.811 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap333f36c3-50, col_values=(('external_ids', {'iface-id': '9387c3ee-5767-48df-b9c7-825c8f92856f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.813 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_controller[94850]: 2026-01-22T22:22:31Z|00126|binding|INFO|Releasing lport 9387c3ee-5767-48df-b9c7-825c8f92856f from this chassis (sb_readonly=0)
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.814 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.815 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/333f36c3-573e-4ed3-8a05-3f051147e707.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/333f36c3-573e-4ed3-8a05-3f051147e707.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.816 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8b825d46-5eba-4807-ae79-808285b262ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.818 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-333f36c3-573e-4ed3-8a05-3f051147e707
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/333f36c3-573e-4ed3-8a05-3f051147e707.pid.haproxy
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 333f36c3-573e-4ed3-8a05-3f051147e707
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:22:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:31.819 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707', 'env', 'PROCESS_TAG=haproxy-333f36c3-573e-4ed3-8a05-3f051147e707', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/333f36c3-573e-4ed3-8a05-3f051147e707.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.971 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.983 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120551.773471, d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:22:31 compute-0 nova_compute[182725]: 2026-01-22 22:22:31.983 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] VM Paused (Lifecycle Event)
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.047 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.051 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:22:32 compute-0 podman[216434]: 2026-01-22 22:22:32.185960073 +0000 UTC m=+0.057662836 container create a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:22:32 compute-0 systemd[1]: Started libpod-conmon-a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58.scope.
Jan 22 22:22:32 compute-0 podman[216434]: 2026-01-22 22:22:32.151370198 +0000 UTC m=+0.023073011 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:22:32 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.263 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:22:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/712536f78d4f6f7c3a6e86fceda6d500036913f7e08a46588291351fbcac8418/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:22:32 compute-0 podman[216434]: 2026-01-22 22:22:32.279004984 +0000 UTC m=+0.150707767 container init a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:22:32 compute-0 podman[216434]: 2026-01-22 22:22:32.285299149 +0000 UTC m=+0.157001912 container start a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:22:32 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [NOTICE]   (216453) : New worker (216455) forked
Jan 22 22:22:32 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [NOTICE]   (216453) : Loading success.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.430 182729 DEBUG nova.compute.manager [req-cbbf0898-f162-46f2-afea-fa3c87566dd3 req-79cecf63-6346-4160-a96e-11aa71cb2218 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.431 182729 DEBUG oslo_concurrency.lockutils [req-cbbf0898-f162-46f2-afea-fa3c87566dd3 req-79cecf63-6346-4160-a96e-11aa71cb2218 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.431 182729 DEBUG oslo_concurrency.lockutils [req-cbbf0898-f162-46f2-afea-fa3c87566dd3 req-79cecf63-6346-4160-a96e-11aa71cb2218 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.431 182729 DEBUG oslo_concurrency.lockutils [req-cbbf0898-f162-46f2-afea-fa3c87566dd3 req-79cecf63-6346-4160-a96e-11aa71cb2218 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.432 182729 DEBUG nova.compute.manager [req-cbbf0898-f162-46f2-afea-fa3c87566dd3 req-79cecf63-6346-4160-a96e-11aa71cb2218 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Processing event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.432 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.437 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120552.4372187, d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.437 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] VM Resumed (Lifecycle Event)
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.439 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.443 182729 INFO nova.virt.libvirt.driver [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Instance spawned successfully.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.444 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.470 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.471 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.471 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.472 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.472 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.472 182729 DEBUG nova.virt.libvirt.driver [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.481 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.488 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.526 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.593 182729 INFO nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Took 5.52 seconds to spawn the instance on the hypervisor.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.594 182729 DEBUG nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.694 182729 DEBUG nova.network.neutron [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updated VIF entry in instance network info cache for port 149448f0-13a7-436b-b700-ada10405a161. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.694 182729 DEBUG nova.network.neutron [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updating instance_info_cache with network_info: [{"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.696 182729 INFO nova.compute.manager [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Took 6.08 seconds to build instance.
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.734 182729 DEBUG oslo_concurrency.lockutils [req-d8fb138d-fbd6-4919-9795-b8a805d7d99e req-4fc141d1-181f-4119-b1e0-4c194b017e0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:22:32 compute-0 nova_compute[182725]: 2026-01-22 22:22:32.737 182729 DEBUG oslo_concurrency.lockutils [None req-3097e020-5462-4e06-896d-cf21ea24909a 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:33 compute-0 nova_compute[182725]: 2026-01-22 22:22:33.635 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.675 182729 DEBUG nova.compute.manager [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.676 182729 DEBUG oslo_concurrency.lockutils [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.677 182729 DEBUG oslo_concurrency.lockutils [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.677 182729 DEBUG oslo_concurrency.lockutils [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.677 182729 DEBUG nova.compute.manager [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] No waiting events found dispatching network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:22:34 compute-0 nova_compute[182725]: 2026-01-22 22:22:34.678 182729 WARNING nova.compute.manager [req-5d615e36-e974-4619-9c12-33bb1859eeee req-15eeffff-24b2-45e3-b31a-f5bd1cdf5cfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received unexpected event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 for instance with vm_state active and task_state None.
Jan 22 22:22:35 compute-0 nova_compute[182725]: 2026-01-22 22:22:35.695 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:35 compute-0 nova_compute[182725]: 2026-01-22 22:22:35.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:35.709 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:22:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:35.710 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:22:36 compute-0 nova_compute[182725]: 2026-01-22 22:22:36.851 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:36 compute-0 NetworkManager[54954]: <info>  [1769120556.8530] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 22 22:22:36 compute-0 NetworkManager[54954]: <info>  [1769120556.8540] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 22 22:22:36 compute-0 nova_compute[182725]: 2026-01-22 22:22:36.951 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:36 compute-0 ovn_controller[94850]: 2026-01-22T22:22:36Z|00127|binding|INFO|Releasing lport 9387c3ee-5767-48df-b9c7-825c8f92856f from this chassis (sb_readonly=0)
Jan 22 22:22:36 compute-0 nova_compute[182725]: 2026-01-22 22:22:36.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:37 compute-0 nova_compute[182725]: 2026-01-22 22:22:37.257 182729 DEBUG nova.compute.manager [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-changed-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:37 compute-0 nova_compute[182725]: 2026-01-22 22:22:37.258 182729 DEBUG nova.compute.manager [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Refreshing instance network info cache due to event network-changed-149448f0-13a7-436b-b700-ada10405a161. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:22:37 compute-0 nova_compute[182725]: 2026-01-22 22:22:37.258 182729 DEBUG oslo_concurrency.lockutils [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:22:37 compute-0 nova_compute[182725]: 2026-01-22 22:22:37.259 182729 DEBUG oslo_concurrency.lockutils [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:22:37 compute-0 nova_compute[182725]: 2026-01-22 22:22:37.260 182729 DEBUG nova.network.neutron [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Refreshing network info cache for port 149448f0-13a7-436b-b700-ada10405a161 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:22:38 compute-0 nova_compute[182725]: 2026-01-22 22:22:38.637 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:40 compute-0 nova_compute[182725]: 2026-01-22 22:22:40.014 182729 DEBUG nova.network.neutron [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updated VIF entry in instance network info cache for port 149448f0-13a7-436b-b700-ada10405a161. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:22:40 compute-0 nova_compute[182725]: 2026-01-22 22:22:40.016 182729 DEBUG nova.network.neutron [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updating instance_info_cache with network_info: [{"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:40 compute-0 nova_compute[182725]: 2026-01-22 22:22:40.087 182729 DEBUG oslo_concurrency.lockutils [req-3ac87d02-f4e6-4c9a-ab77-4c011b3130f4 req-5d23f1bf-d703-4988-9aac-11e3490a4788 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:22:41 compute-0 nova_compute[182725]: 2026-01-22 22:22:41.753 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:42 compute-0 podman[216465]: 2026-01-22 22:22:42.136613337 +0000 UTC m=+0.066098225 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 22:22:42 compute-0 podman[216466]: 2026-01-22 22:22:42.176029702 +0000 UTC m=+0.100838744 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:22:43 compute-0 nova_compute[182725]: 2026-01-22 22:22:43.640 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:45.713 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:45 compute-0 ovn_controller[94850]: 2026-01-22T22:22:45Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:45:5e 10.100.0.5
Jan 22 22:22:45 compute-0 ovn_controller[94850]: 2026-01-22T22:22:45Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:45:5e 10.100.0.5
Jan 22 22:22:46 compute-0 podman[216527]: 2026-01-22 22:22:46.140177205 +0000 UTC m=+0.060336303 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:22:46 compute-0 nova_compute[182725]: 2026-01-22 22:22:46.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:48 compute-0 nova_compute[182725]: 2026-01-22 22:22:48.641 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:51 compute-0 nova_compute[182725]: 2026-01-22 22:22:51.760 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.618 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.619 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.619 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.620 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.620 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.633 182729 INFO nova.compute.manager [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Terminating instance
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.646 182729 DEBUG nova.compute.manager [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:22:52 compute-0 kernel: tap149448f0-13 (unregistering): left promiscuous mode
Jan 22 22:22:52 compute-0 NetworkManager[54954]: <info>  [1769120572.6705] device (tap149448f0-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:22:52 compute-0 ovn_controller[94850]: 2026-01-22T22:22:52Z|00128|binding|INFO|Releasing lport 149448f0-13a7-436b-b700-ada10405a161 from this chassis (sb_readonly=0)
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.676 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 ovn_controller[94850]: 2026-01-22T22:22:52Z|00129|binding|INFO|Setting lport 149448f0-13a7-436b-b700-ada10405a161 down in Southbound
Jan 22 22:22:52 compute-0 ovn_controller[94850]: 2026-01-22T22:22:52Z|00130|binding|INFO|Removing iface tap149448f0-13 ovn-installed in OVS
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.680 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.690 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:45:5e 10.100.0.5'], port_security=['fa:16:3e:dd:45:5e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd7e18cec-eb96-4031-a96e-e8ebfe11d7f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-333f36c3-573e-4ed3-8a05-3f051147e707', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8612432d2f8442ee9d0e924d1da859d7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '90e37ac1-e6af-477a-8819-6c97c17632f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61cd2af8-2c5e-4786-a783-41f32dde76e7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=149448f0-13a7-436b-b700-ada10405a161) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.692 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.693 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 149448f0-13a7-436b-b700-ada10405a161 in datapath 333f36c3-573e-4ed3-8a05-3f051147e707 unbound from our chassis
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.695 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 333f36c3-573e-4ed3-8a05-3f051147e707, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.696 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d5258236-d9b7-4715-aeea-0f7cbf8991e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.697 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707 namespace which is not needed anymore
Jan 22 22:22:52 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 22 22:22:52 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002a.scope: Consumed 12.821s CPU time.
Jan 22 22:22:52 compute-0 systemd-machined[154006]: Machine qemu-18-instance-0000002a terminated.
Jan 22 22:22:52 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [NOTICE]   (216453) : haproxy version is 2.8.14-c23fe91
Jan 22 22:22:52 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [NOTICE]   (216453) : path to executable is /usr/sbin/haproxy
Jan 22 22:22:52 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [WARNING]  (216453) : Exiting Master process...
Jan 22 22:22:52 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [ALERT]    (216453) : Current worker (216455) exited with code 143 (Terminated)
Jan 22 22:22:52 compute-0 neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707[216449]: [WARNING]  (216453) : All workers exited. Exiting... (0)
Jan 22 22:22:52 compute-0 systemd[1]: libpod-a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58.scope: Deactivated successfully.
Jan 22 22:22:52 compute-0 podman[216576]: 2026-01-22 22:22:52.834933046 +0000 UTC m=+0.042217365 container died a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:22:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58-userdata-shm.mount: Deactivated successfully.
Jan 22 22:22:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-712536f78d4f6f7c3a6e86fceda6d500036913f7e08a46588291351fbcac8418-merged.mount: Deactivated successfully.
Jan 22 22:22:52 compute-0 podman[216576]: 2026-01-22 22:22:52.878102743 +0000 UTC m=+0.085387042 container cleanup a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:22:52 compute-0 systemd[1]: libpod-conmon-a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58.scope: Deactivated successfully.
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.924 182729 INFO nova.virt.libvirt.driver [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Instance destroyed successfully.
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.926 182729 DEBUG nova.objects.instance [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lazy-loading 'resources' on Instance uuid d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.943 182729 DEBUG nova.virt.libvirt.vif [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=42,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXm2Sl4t9UMfiHNJWOITRNO0WQkLl/0gmoEZQLwptcYmG0W32tstvPhyqNMV3ZeC1It4ACWm/w2fPxk1aMvnTtBLxMmlV1UCKTEvTVfhX+2VhG8ORzpXFbxo4WEaDP4Sg==',key_name='tempest-keypair-875523264',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:22:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8612432d2f8442ee9d0e924d1da859d7',ramdisk_id='',reservation_id='r-wk3encuz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-2006574538',owner_user_name='tempest-ServersTestFqdnHostnames-2006574538-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:22:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7df5d821a5ca4c08abc23a1f0c71403a',uuid=d7e18cec-eb96-4031-a96e-e8ebfe11d7f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.944 182729 DEBUG nova.network.os_vif_util [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converting VIF {"id": "149448f0-13a7-436b-b700-ada10405a161", "address": "fa:16:3e:dd:45:5e", "network": {"id": "333f36c3-573e-4ed3-8a05-3f051147e707", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1039968212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8612432d2f8442ee9d0e924d1da859d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap149448f0-13", "ovs_interfaceid": "149448f0-13a7-436b-b700-ada10405a161", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.945 182729 DEBUG nova.network.os_vif_util [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.946 182729 DEBUG os_vif [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.949 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap149448f0-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.950 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 podman[216610]: 2026-01-22 22:22:52.95119042 +0000 UTC m=+0.047532166 container remove a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.952 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.952 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.956 182729 INFO os_vif [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:45:5e,bridge_name='br-int',has_traffic_filtering=True,id=149448f0-13a7-436b-b700-ada10405a161,network=Network(333f36c3-573e-4ed3-8a05-3f051147e707),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap149448f0-13')
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.956 182729 INFO nova.virt.libvirt.driver [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Deleting instance files /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4_del
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.956 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5676e730-dff2-47de-93f6-5c7c8380e41c]: (4, ('Thu Jan 22 10:22:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707 (a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58)\na4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58\nThu Jan 22 10:22:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707 (a4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58)\na4ec2b8d9e2e46565d63368d3ec9358f3bf2cfa256ec3db287cafee3ab4ced58\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.957 182729 INFO nova.virt.libvirt.driver [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Deletion of /var/lib/nova/instances/d7e18cec-eb96-4031-a96e-e8ebfe11d7f4_del complete
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.959 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cf3d6757-772e-4f70-b1e2-dea866f1cbdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.960 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap333f36c3-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.962 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 kernel: tap333f36c3-50: left promiscuous mode
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.988 182729 DEBUG nova.compute.manager [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-unplugged-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.988 182729 DEBUG oslo_concurrency.lockutils [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.989 182729 DEBUG oslo_concurrency.lockutils [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.989 182729 DEBUG oslo_concurrency.lockutils [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.989 182729 DEBUG nova.compute.manager [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] No waiting events found dispatching network-vif-unplugged-149448f0-13a7-436b-b700-ada10405a161 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.989 182729 DEBUG nova.compute.manager [req-1601076b-5dbe-441f-a3fb-7601499d24c7 req-5306b00d-746e-4c6d-bfb1-4fd608049a8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-unplugged-149448f0-13a7-436b-b700-ada10405a161 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:22:52 compute-0 nova_compute[182725]: 2026-01-22 22:22:52.989 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:52.990 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b019cb-d0ed-419f-8b74-8a6ec1e91300]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:53.008 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[352550f4-378c-431b-be72-e149e26ce6ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:53.009 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aee103ee-39db-41a0-b07d-4982f0b9346e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:53.028 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1249ee9b-546d-4e39-a469-431d32334ef1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418912, 'reachable_time': 15267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216636, 'error': None, 'target': 'ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d333f36c3\x2d573e\x2d4ed3\x2d8a05\x2d3f051147e707.mount: Deactivated successfully.
Jan 22 22:22:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:53.034 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-333f36c3-573e-4ed3-8a05-3f051147e707 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:22:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:22:53.035 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[0a3a5f54-1ef5-4c45-a47b-87e45ac76b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:22:53 compute-0 nova_compute[182725]: 2026-01-22 22:22:53.040 182729 INFO nova.compute.manager [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:22:53 compute-0 nova_compute[182725]: 2026-01-22 22:22:53.041 182729 DEBUG oslo.service.loopingcall [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:22:53 compute-0 nova_compute[182725]: 2026-01-22 22:22:53.041 182729 DEBUG nova.compute.manager [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:22:53 compute-0 nova_compute[182725]: 2026-01-22 22:22:53.041 182729 DEBUG nova.network.neutron [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:22:53 compute-0 nova_compute[182725]: 2026-01-22 22:22:53.643 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.096 182729 DEBUG nova.compute.manager [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.096 182729 DEBUG oslo_concurrency.lockutils [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.097 182729 DEBUG oslo_concurrency.lockutils [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.097 182729 DEBUG oslo_concurrency.lockutils [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.097 182729 DEBUG nova.compute.manager [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] No waiting events found dispatching network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.098 182729 WARNING nova.compute.manager [req-d6e290fc-5483-45a1-8d57-ced062adefde req-0239d272-144e-4d1f-bac8-44cfad43ca93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received unexpected event network-vif-plugged-149448f0-13a7-436b-b700-ada10405a161 for instance with vm_state active and task_state deleting.
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.304 182729 DEBUG nova.network.neutron [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.320 182729 INFO nova.compute.manager [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Took 2.28 seconds to deallocate network for instance.
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.403 182729 DEBUG nova.compute.manager [req-85f48832-c477-477f-9990-0b3a172105d6 req-271c73df-6384-4ec3-8f64-bb351688dc70 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Received event network-vif-deleted-149448f0-13a7-436b-b700-ada10405a161 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.422 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.422 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.511 182729 DEBUG nova.compute.provider_tree [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.529 182729 DEBUG nova.scheduler.client.report [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.551 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.575 182729 INFO nova.scheduler.client.report [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Deleted allocations for instance d7e18cec-eb96-4031-a96e-e8ebfe11d7f4
Jan 22 22:22:55 compute-0 nova_compute[182725]: 2026-01-22 22:22:55.650 182729 DEBUG oslo_concurrency.lockutils [None req-379e8f00-0d3a-4c43-a1af-bc2af392cdfc 7df5d821a5ca4c08abc23a1f0c71403a 8612432d2f8442ee9d0e924d1da859d7 - - default default] Lock "d7e18cec-eb96-4031-a96e-e8ebfe11d7f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:57 compute-0 nova_compute[182725]: 2026-01-22 22:22:57.951 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:58 compute-0 nova_compute[182725]: 2026-01-22 22:22:58.646 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.453 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.454 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.472 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.588 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.589 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.601 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.602 182729 INFO nova.compute.claims [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.755 182729 DEBUG nova.compute.provider_tree [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.767 182729 DEBUG nova.scheduler.client.report [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.798 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.799 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.896 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.896 182729 DEBUG nova.network.neutron [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.918 182729 INFO nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:22:59 compute-0 nova_compute[182725]: 2026-01-22 22:22:59.984 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.103 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.105 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.105 182729 INFO nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Creating image(s)
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.105 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.106 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.106 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.120 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:00 compute-0 podman[216650]: 2026-01-22 22:23:00.159321821 +0000 UTC m=+0.081346522 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.212 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.213 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.214 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.225 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.282 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.283 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.319 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.320 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.321 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.339 182729 DEBUG nova.network.neutron [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.340 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.380 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.382 182729 DEBUG nova.virt.disk.api [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Checking if we can resize image /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.386 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.446 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.448 182729 DEBUG nova.virt.disk.api [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Cannot resize image /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.448 182729 DEBUG nova.objects.instance [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.464 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.464 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Ensure instance console log exists: /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.465 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.465 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.466 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.468 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.476 182729 WARNING nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.481 182729 DEBUG nova.virt.libvirt.host [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.483 182729 DEBUG nova.virt.libvirt.host [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.487 182729 DEBUG nova.virt.libvirt.host [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.488 182729 DEBUG nova.virt.libvirt.host [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.489 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.490 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.490 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.491 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.491 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.491 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.491 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.492 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.492 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.492 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.493 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.493 182729 DEBUG nova.virt.hardware [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.498 182729 DEBUG nova.objects.instance [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.518 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <uuid>3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</uuid>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <name>instance-0000002d</name>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:name>tempest-MigrationsAdminTest-server-1010543436</nova:name>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:23:00</nova:creationTime>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 22:23:00 compute-0 nova_compute[182725]:         <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <system>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="serial">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="uuid">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </system>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <os>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </os>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <features>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </features>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/console.log" append="off"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <video>
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </video>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:23:00 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:23:00 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:23:00 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:23:00 compute-0 nova_compute[182725]: </domain>
Jan 22 22:23:00 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.578 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.580 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.581 182729 INFO nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Using config drive
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.917 182729 INFO nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Creating config drive at /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config
Jan 22 22:23:00 compute-0 nova_compute[182725]: 2026-01-22 22:23:00.926 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6_gzusdm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.071 182729 DEBUG oslo_concurrency.processutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6_gzusdm" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:01 compute-0 systemd-machined[154006]: New machine qemu-19-instance-0000002d.
Jan 22 22:23:01 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000002d.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.464 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120581.463662, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.466 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Resumed (Lifecycle Event)
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.471 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.472 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.478 182729 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance spawned successfully.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.478 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.506 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.517 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.522 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.522 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.522 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.523 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.523 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.524 182729 DEBUG nova.virt.libvirt.driver [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.564 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.564 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120581.4658952, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.565 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Started (Lifecycle Event)
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.594 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.600 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.621 182729 INFO nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Took 1.52 seconds to spawn the instance on the hypervisor.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.622 182729 DEBUG nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.627 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.707 182729 INFO nova.compute.manager [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Took 2.16 seconds to build instance.
Jan 22 22:23:01 compute-0 nova_compute[182725]: 2026-01-22 22:23:01.724 182729 DEBUG oslo_concurrency.lockutils [None req-a1b023a9-00b4-4684-b345-6f98335bd0f7 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:02 compute-0 nova_compute[182725]: 2026-01-22 22:23:02.107 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:02 compute-0 podman[216714]: 2026-01-22 22:23:02.173840295 +0000 UTC m=+0.099092191 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, release=1755695350)
Jan 22 22:23:02 compute-0 nova_compute[182725]: 2026-01-22 22:23:02.270 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:02 compute-0 podman[216713]: 2026-01-22 22:23:02.287257238 +0000 UTC m=+0.211159210 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:23:02 compute-0 nova_compute[182725]: 2026-01-22 22:23:02.954 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:03 compute-0 nova_compute[182725]: 2026-01-22 22:23:03.650 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:04 compute-0 nova_compute[182725]: 2026-01-22 22:23:04.295 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:04 compute-0 nova_compute[182725]: 2026-01-22 22:23:04.295 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquired lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:04 compute-0 nova_compute[182725]: 2026-01-22 22:23:04.296 182729 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:23:04 compute-0 nova_compute[182725]: 2026-01-22 22:23:04.471 182729 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.059 182729 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.073 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Releasing lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.190 182729 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.191 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Creating file /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/51ef6f65edff41de8e0975b9ae814767.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.191 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/51ef6f65edff41de8e0975b9ae814767.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.618 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/51ef6f65edff41de8e0975b9ae814767.tmp" returned: 1 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.620 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/51ef6f65edff41de8e0975b9ae814767.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.620 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Creating directory /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.621 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.847 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:05 compute-0 nova_compute[182725]: 2026-01-22 22:23:05.854 182729 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:23:07 compute-0 nova_compute[182725]: 2026-01-22 22:23:07.921 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120572.919542, d7e18cec-eb96-4031-a96e-e8ebfe11d7f4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:07 compute-0 nova_compute[182725]: 2026-01-22 22:23:07.921 182729 INFO nova.compute.manager [-] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] VM Stopped (Lifecycle Event)
Jan 22 22:23:07 compute-0 nova_compute[182725]: 2026-01-22 22:23:07.956 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:07 compute-0 nova_compute[182725]: 2026-01-22 22:23:07.968 182729 DEBUG nova.compute.manager [None req-e7ddb20e-85a7-4acd-98b2-022e03884b02 - - - - - -] [instance: d7e18cec-eb96-4031-a96e-e8ebfe11d7f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:08 compute-0 nova_compute[182725]: 2026-01-22 22:23:08.653 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:08 compute-0 nova_compute[182725]: 2026-01-22 22:23:08.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:10 compute-0 nova_compute[182725]: 2026-01-22 22:23:10.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:10 compute-0 nova_compute[182725]: 2026-01-22 22:23:10.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:23:10 compute-0 nova_compute[182725]: 2026-01-22 22:23:10.988 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Jan 22 22:23:10 compute-0 nova_compute[182725]: 2026-01-22 22:23:10.989 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:23:10 compute-0 nova_compute[182725]: 2026-01-22 22:23:10.989 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:11 compute-0 nova_compute[182725]: 2026-01-22 22:23:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:11 compute-0 nova_compute[182725]: 2026-01-22 22:23:11.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:11 compute-0 nova_compute[182725]: 2026-01-22 22:23:11.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:11 compute-0 nova_compute[182725]: 2026-01-22 22:23:11.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:11 compute-0 nova_compute[182725]: 2026-01-22 22:23:11.918 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.042 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.223 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.226 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.334 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.343 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.408 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.410 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:12.431 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:12.431 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:12.431 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.516 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.735 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.738 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5323MB free_disk=73.34774017333984GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.738 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.738 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.835 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating resource usage from migration 2ed6e609-89da-4417-b04a-071e732e3e04
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.876 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 79166459-7b8b-44ed-8dba-0ba4cb9d97ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.877 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration 2ed6e609-89da-4417-b04a-071e732e3e04 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.877 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.877 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.945 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.958 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.968 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.991 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:23:12 compute-0 nova_compute[182725]: 2026-01-22 22:23:12.992 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:13 compute-0 podman[216798]: 2026-01-22 22:23:13.170723502 +0000 UTC m=+0.087509743 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:23:13 compute-0 podman[216799]: 2026-01-22 22:23:13.176426213 +0000 UTC m=+0.090018705 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:23:13 compute-0 nova_compute[182725]: 2026-01-22 22:23:13.653 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:13 compute-0 nova_compute[182725]: 2026-01-22 22:23:13.993 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:13 compute-0 nova_compute[182725]: 2026-01-22 22:23:13.993 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:13 compute-0 nova_compute[182725]: 2026-01-22 22:23:13.994 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:23:15 compute-0 nova_compute[182725]: 2026-01-22 22:23:15.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:15 compute-0 nova_compute[182725]: 2026-01-22 22:23:15.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:15 compute-0 nova_compute[182725]: 2026-01-22 22:23:15.917 182729 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:23:16 compute-0 nova_compute[182725]: 2026-01-22 22:23:16.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:17 compute-0 podman[216838]: 2026-01-22 22:23:17.165420949 +0000 UTC m=+0.079378342 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:23:17 compute-0 nova_compute[182725]: 2026-01-22 22:23:17.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:23:17 compute-0 nova_compute[182725]: 2026-01-22 22:23:17.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:18 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 22 22:23:18 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002d.scope: Consumed 12.274s CPU time.
Jan 22 22:23:18 compute-0 systemd-machined[154006]: Machine qemu-19-instance-0000002d terminated.
Jan 22 22:23:18 compute-0 nova_compute[182725]: 2026-01-22 22:23:18.656 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:18 compute-0 nova_compute[182725]: 2026-01-22 22:23:18.935 182729 INFO nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance shutdown successfully after 13 seconds.
Jan 22 22:23:18 compute-0 nova_compute[182725]: 2026-01-22 22:23:18.943 182729 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance destroyed successfully.
Jan 22 22:23:18 compute-0 nova_compute[182725]: 2026-01-22 22:23:18.947 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.039 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.041 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.100 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.102 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk to 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.102 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.944 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk" returned: 0 in 0.842s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.947 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:23:19 compute-0 nova_compute[182725]: 2026-01-22 22:23:19.948 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.194 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -C -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.196 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.196 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.466 182729 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -C -r /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.631 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.632 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:20 compute-0 nova_compute[182725]: 2026-01-22 22:23:20.632 182729 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:22 compute-0 nova_compute[182725]: 2026-01-22 22:23:22.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:23 compute-0 nova_compute[182725]: 2026-01-22 22:23:23.659 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:26 compute-0 nova_compute[182725]: 2026-01-22 22:23:26.710 182729 INFO nova.compute.manager [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Swapping old allocation on dict_keys(['4f7db789-7f4b-4901-9c88-ecf66d0aff43']) held by migration 2ed6e609-89da-4417-b04a-071e732e3e04 for instance
Jan 22 22:23:26 compute-0 nova_compute[182725]: 2026-01-22 22:23:26.747 182729 DEBUG nova.scheduler.client.report [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Overwriting current allocation {'allocations': {'8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 36}}, 'project_id': 'e5385c77364a4925bcdfff2bd744eb0b', 'user_id': '8ca7b75a121d4858bc8d282f0c6728e0', 'consumer_generation': 1} on consumer 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 22 22:23:26 compute-0 nova_compute[182725]: 2026-01-22 22:23:26.967 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:26 compute-0 nova_compute[182725]: 2026-01-22 22:23:26.968 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:26 compute-0 nova_compute[182725]: 2026-01-22 22:23:26.969 182729 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.626 182729 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.951 182729 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.964 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.967 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.968 182729 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.975 182729 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.979 182729 WARNING nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.983 182729 DEBUG nova.virt.libvirt.host [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.984 182729 DEBUG nova.virt.libvirt.host [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.988 182729 DEBUG nova.virt.libvirt.host [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.988 182729 DEBUG nova.virt.libvirt.host [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.989 182729 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.989 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.990 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.990 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.990 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.990 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.991 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.991 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.991 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.991 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.991 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.992 182729 DEBUG nova.virt.hardware [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:23:27 compute-0 nova_compute[182725]: 2026-01-22 22:23:27.992 182729 DEBUG nova.objects.instance [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.008 182729 DEBUG oslo_concurrency.processutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.100 182729 DEBUG oslo_concurrency.processutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.103 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.104 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.105 182729 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.110 182729 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <uuid>3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</uuid>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <name>instance-0000002d</name>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:name>tempest-MigrationsAdminTest-server-1010543436</nova:name>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:23:27</nova:creationTime>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 22:23:28 compute-0 nova_compute[182725]:         <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <nova:ports/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <system>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="serial">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="uuid">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </system>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <os>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </os>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <features>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </features>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/console.log" append="off"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <video>
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </video>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:23:28 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:23:28 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:23:28 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:23:28 compute-0 nova_compute[182725]: </domain>
Jan 22 22:23:28 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:23:28 compute-0 systemd-machined[154006]: New machine qemu-20-instance-0000002d.
Jan 22 22:23:28 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000002d.
Jan 22 22:23:28 compute-0 nova_compute[182725]: 2026-01-22 22:23:28.664 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.106 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.107 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120609.1053123, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.108 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Resumed (Lifecycle Event)
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.112 182729 DEBUG nova.compute.manager [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.117 182729 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance running successfully.
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.118 182729 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.129 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.134 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.149 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.150 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120609.1078324, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.150 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Started (Lifecycle Event)
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.182 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.186 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.210 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 22:23:29 compute-0 nova_compute[182725]: 2026-01-22 22:23:29.223 182729 INFO nova.compute.manager [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance to original state: 'active'
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.398 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.399 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.400 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.400 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.401 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.417 182729 INFO nova.compute.manager [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Terminating instance
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.431 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.432 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.432 182729 DEBUG nova.network.neutron [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:23:30 compute-0 nova_compute[182725]: 2026-01-22 22:23:30.915 182729 DEBUG nova.network.neutron [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:31 compute-0 podman[216912]: 2026-01-22 22:23:31.164053236 +0000 UTC m=+0.087956404 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.258 182729 DEBUG nova.network.neutron [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.284 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.285 182729 DEBUG nova.compute.manager [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:23:31 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 22 22:23:31 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Consumed 3.180s CPU time.
Jan 22 22:23:31 compute-0 systemd-machined[154006]: Machine qemu-20-instance-0000002d terminated.
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.567 182729 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance destroyed successfully.
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.568 182729 DEBUG nova.objects.instance [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.581 182729 INFO nova.virt.libvirt.driver [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Deleting instance files /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_del
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.591 182729 INFO nova.virt.libvirt.driver [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Deletion of /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_del complete
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.796 182729 INFO nova.compute.manager [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Took 0.51 seconds to destroy the instance on the hypervisor.
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.798 182729 DEBUG oslo.service.loopingcall [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.798 182729 DEBUG nova.compute.manager [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.799 182729 DEBUG nova.network.neutron [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.925 182729 DEBUG nova.network.neutron [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.939 182729 DEBUG nova.network.neutron [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:31 compute-0 nova_compute[182725]: 2026-01-22 22:23:31.977 182729 INFO nova.compute.manager [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Took 0.18 seconds to deallocate network for instance.
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.078 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.079 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.185 182729 DEBUG nova.compute.provider_tree [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.204 182729 DEBUG nova.scheduler.client.report [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.239 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.270 182729 INFO nova.scheduler.client.report [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Deleted allocations for instance 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.343 182729 DEBUG oslo_concurrency.lockutils [None req-a500be7e-2bb0-4395-9fd0-a0cebcea8b0a 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:32 compute-0 nova_compute[182725]: 2026-01-22 22:23:32.966 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:33 compute-0 podman[216942]: 2026-01-22 22:23:33.149944766 +0000 UTC m=+0.079961286 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=)
Jan 22 22:23:33 compute-0 podman[216941]: 2026-01-22 22:23:33.177067206 +0000 UTC m=+0.104527553 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:23:33 compute-0 nova_compute[182725]: 2026-01-22 22:23:33.665 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.761 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.761 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.783 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.884 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.884 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.891 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:23:35 compute-0 nova_compute[182725]: 2026-01-22 22:23:35.892 182729 INFO nova.compute.claims [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.071 182729 DEBUG nova.compute.provider_tree [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.102 182729 DEBUG nova.scheduler.client.report [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:23:36 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.124 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.125 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.192 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.193 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.217 182729 INFO nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.247 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.359 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.361 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.362 182729 INFO nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Creating image(s)
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.364 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.365 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.366 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.394 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.474 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.476 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.477 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.501 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.542 182729 DEBUG nova.policy [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d9fe7f0e8b4edf92fa2064aaab8bca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3a2ee662fba426c8f688455b20759bf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.588 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.589 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.654 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.656 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.657 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.736 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.738 182729 DEBUG nova.virt.disk.api [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Checking if we can resize image /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.739 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.811 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.813 182729 DEBUG nova.virt.disk.api [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Cannot resize image /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.814 182729 DEBUG nova.objects.instance [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'migration_context' on Instance uuid 5d4456c4-888d-4a4f-b820-b7eed8f26b8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.833 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.834 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Ensure instance console log exists: /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.835 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.836 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:36 compute-0 nova_compute[182725]: 2026-01-22 22:23:36.836 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:37 compute-0 nova_compute[182725]: 2026-01-22 22:23:37.631 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Successfully created port: c3986942-6b78-4dab-8b76-b23ae6f1fb0e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:23:37 compute-0 nova_compute[182725]: 2026-01-22 22:23:37.968 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:38 compute-0 nova_compute[182725]: 2026-01-22 22:23:38.667 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:39 compute-0 nova_compute[182725]: 2026-01-22 22:23:39.567 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Successfully updated port: c3986942-6b78-4dab-8b76-b23ae6f1fb0e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:23:39 compute-0 nova_compute[182725]: 2026-01-22 22:23:39.583 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:39 compute-0 nova_compute[182725]: 2026-01-22 22:23:39.583 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquired lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:39 compute-0 nova_compute[182725]: 2026-01-22 22:23:39.583 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:23:39 compute-0 nova_compute[182725]: 2026-01-22 22:23:39.991 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:40 compute-0 nova_compute[182725]: 2026-01-22 22:23:40.546 182729 DEBUG nova.compute.manager [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-changed-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:23:40 compute-0 nova_compute[182725]: 2026-01-22 22:23:40.547 182729 DEBUG nova.compute.manager [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Refreshing instance network info cache due to event network-changed-c3986942-6b78-4dab-8b76-b23ae6f1fb0e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:23:40 compute-0 nova_compute[182725]: 2026-01-22 22:23:40.547 182729 DEBUG oslo_concurrency.lockutils [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:40 compute-0 nova_compute[182725]: 2026-01-22 22:23:40.989 182729 DEBUG nova.network.neutron [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Updating instance_info_cache with network_info: [{"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.064 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Releasing lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.065 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Instance network_info: |[{"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.067 182729 DEBUG oslo_concurrency.lockutils [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.067 182729 DEBUG nova.network.neutron [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Refreshing network info cache for port c3986942-6b78-4dab-8b76-b23ae6f1fb0e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.073 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Start _get_guest_xml network_info=[{"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.081 182729 WARNING nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.086 182729 DEBUG nova.virt.libvirt.host [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.087 182729 DEBUG nova.virt.libvirt.host [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.091 182729 DEBUG nova.virt.libvirt.host [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.092 182729 DEBUG nova.virt.libvirt.host [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.094 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.094 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.095 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.096 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.096 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.097 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.097 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.098 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.098 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.099 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.099 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.100 182729 DEBUG nova.virt.hardware [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.107 182729 DEBUG nova.virt.libvirt.vif [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-314735179',display_name='tempest-ImagesTestJSON-server-314735179',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-314735179',id=48,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ln28ieri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:36Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=5d4456c4-888d-4a4f-b820-b7eed8f26b8b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.108 182729 DEBUG nova.network.os_vif_util [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.109 182729 DEBUG nova.network.os_vif_util [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.111 182729 DEBUG nova.objects.instance [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d4456c4-888d-4a4f-b820-b7eed8f26b8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.137 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <uuid>5d4456c4-888d-4a4f-b820-b7eed8f26b8b</uuid>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <name>instance-00000030</name>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:name>tempest-ImagesTestJSON-server-314735179</nova:name>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:23:41</nova:creationTime>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:user uuid="52d9fe7f0e8b4edf92fa2064aaab8bca">tempest-ImagesTestJSON-23148374-project-member</nova:user>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:project uuid="d3a2ee662fba426c8f688455b20759bf">tempest-ImagesTestJSON-23148374</nova:project>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         <nova:port uuid="c3986942-6b78-4dab-8b76-b23ae6f1fb0e">
Jan 22 22:23:41 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <system>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="serial">5d4456c4-888d-4a4f-b820-b7eed8f26b8b</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="uuid">5d4456c4-888d-4a4f-b820-b7eed8f26b8b</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </system>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <os>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </os>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <features>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </features>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.config"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:c0:34:52"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <target dev="tapc3986942-6b"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/console.log" append="off"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <video>
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </video>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:23:41 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:23:41 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:23:41 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:23:41 compute-0 nova_compute[182725]: </domain>
Jan 22 22:23:41 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.139 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Preparing to wait for external event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.139 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.140 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.140 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.141 182729 DEBUG nova.virt.libvirt.vif [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-314735179',display_name='tempest-ImagesTestJSON-server-314735179',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-314735179',id=48,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ln28ieri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:36Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=5d4456c4-888d-4a4f-b820-b7eed8f26b8b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.141 182729 DEBUG nova.network.os_vif_util [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.141 182729 DEBUG nova.network.os_vif_util [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.142 182729 DEBUG os_vif [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.143 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.143 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.143 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.146 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.147 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3986942-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.147 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3986942-6b, col_values=(('external_ids', {'iface-id': 'c3986942-6b78-4dab-8b76-b23ae6f1fb0e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:34:52', 'vm-uuid': '5d4456c4-888d-4a4f-b820-b7eed8f26b8b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.149 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 NetworkManager[54954]: <info>  [1769120621.1512] manager: (tapc3986942-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.152 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.162 182729 INFO os_vif [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b')
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.227 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.227 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.227 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No VIF found with MAC fa:16:3e:c0:34:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.228 182729 INFO nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Using config drive
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.274 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.276 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.322 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.323 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.324 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.325 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.325 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.341 182729 INFO nova.compute.manager [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Terminating instance
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.355 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.355 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.356 182729 DEBUG nova.network.neutron [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.573 182729 INFO nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Creating config drive at /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.config
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.579 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxs8rxl9o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.607 182729 DEBUG nova.network.neutron [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.715 182729 DEBUG oslo_concurrency.processutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxs8rxl9o" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:41 compute-0 kernel: tapc3986942-6b: entered promiscuous mode
Jan 22 22:23:41 compute-0 NetworkManager[54954]: <info>  [1769120621.8004] manager: (tapc3986942-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 22 22:23:41 compute-0 ovn_controller[94850]: 2026-01-22T22:23:41Z|00131|binding|INFO|Claiming lport c3986942-6b78-4dab-8b76-b23ae6f1fb0e for this chassis.
Jan 22 22:23:41 compute-0 ovn_controller[94850]: 2026-01-22T22:23:41Z|00132|binding|INFO|c3986942-6b78-4dab-8b76-b23ae6f1fb0e: Claiming fa:16:3e:c0:34:52 10.100.0.8
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.799 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.804 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.816 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:34:52 10.100.0.8'], port_security=['fa:16:3e:c0:34:52 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d4456c4-888d-4a4f-b820-b7eed8f26b8b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c3986942-6b78-4dab-8b76-b23ae6f1fb0e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.818 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c3986942-6b78-4dab-8b76-b23ae6f1fb0e in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 bound to our chassis
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.821 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.846 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[26bd201c-7cc6-4083-a221-7c29bcf5baa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.847 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd5f6392-b1 in ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.850 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd5f6392-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.850 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb2bcee-3037-4c78-99fc-a28bbca53956]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.851 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[687eb940-2284-4214-8f59-3514de08b646]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 systemd-udevd[217026]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.867 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[975e88b8-220a-48af-b351-1c51068d01e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.876 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 systemd-machined[154006]: New machine qemu-21-instance-00000030.
Jan 22 22:23:41 compute-0 ovn_controller[94850]: 2026-01-22T22:23:41Z|00133|binding|INFO|Setting lport c3986942-6b78-4dab-8b76-b23ae6f1fb0e ovn-installed in OVS
Jan 22 22:23:41 compute-0 ovn_controller[94850]: 2026-01-22T22:23:41Z|00134|binding|INFO|Setting lport c3986942-6b78-4dab-8b76-b23ae6f1fb0e up in Southbound
Jan 22 22:23:41 compute-0 nova_compute[182725]: 2026-01-22 22:23:41.883 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.888 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8486c344-e224-45bd-b7b6-914e27163dd3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000030.
Jan 22 22:23:41 compute-0 NetworkManager[54954]: <info>  [1769120621.8944] device (tapc3986942-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:23:41 compute-0 NetworkManager[54954]: <info>  [1769120621.8978] device (tapc3986942-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.932 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[16e10c5e-f9fb-4775-8c40-6940370fb246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.939 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[201f5344-160e-4daf-9cad-291ea62ab5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 NetworkManager[54954]: <info>  [1769120621.9419] manager: (tapdd5f6392-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 22 22:23:41 compute-0 systemd-udevd[217029]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.979 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf93c54-0bcd-4cfa-b263-c54cd3a545c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:41.983 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c2532a46-f0e1-46c5-ad12-09678dcd8bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 NetworkManager[54954]: <info>  [1769120622.0141] device (tapdd5f6392-b0): carrier: link connected
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.024 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cb97b718-f74d-4296-a614-fd116fca7f38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.050 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e661d290-da57-4f49-a6f8-b4db655c6c3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425961, 'reachable_time': 15900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217057, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.073 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b1133273-e115-40e9-9123-9615c4d5e57c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:d723'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425961, 'tstamp': 425961}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217058, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.102 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[61fbc19d-a15b-4524-9a80-3a20eb2a29a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425961, 'reachable_time': 15900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217059, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.107 182729 DEBUG nova.network.neutron [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.128 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.129 182729 DEBUG nova.compute.manager [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.156 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b60aed02-714a-4b33-a564-ba7ad249851c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.168 182729 DEBUG nova.compute.manager [req-3434731a-7965-4505-9e1c-5bc25e623fb7 req-4833abcf-e8a9-49ad-8c2a-069229be22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.169 182729 DEBUG oslo_concurrency.lockutils [req-3434731a-7965-4505-9e1c-5bc25e623fb7 req-4833abcf-e8a9-49ad-8c2a-069229be22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.169 182729 DEBUG oslo_concurrency.lockutils [req-3434731a-7965-4505-9e1c-5bc25e623fb7 req-4833abcf-e8a9-49ad-8c2a-069229be22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.169 182729 DEBUG oslo_concurrency.lockutils [req-3434731a-7965-4505-9e1c-5bc25e623fb7 req-4833abcf-e8a9-49ad-8c2a-069229be22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.170 182729 DEBUG nova.compute.manager [req-3434731a-7965-4505-9e1c-5bc25e623fb7 req-4833abcf-e8a9-49ad-8c2a-069229be22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Processing event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:23:42 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 22 22:23:42 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000025.scope: Consumed 16.983s CPU time.
Jan 22 22:23:42 compute-0 systemd-machined[154006]: Machine qemu-16-instance-00000025 terminated.
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.249 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9a006e66-39c9-4b45-918b-90dda6b782b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.251 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.251 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.252 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd5f6392-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:42 compute-0 NetworkManager[54954]: <info>  [1769120622.2552] manager: (tapdd5f6392-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:42 compute-0 kernel: tapdd5f6392-b0: entered promiscuous mode
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.259 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.260 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd5f6392-b0, col_values=(('external_ids', {'iface-id': 'c2b5e191-6c34-4707-83d4-b3c5bc12ff1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:42 compute-0 ovn_controller[94850]: 2026-01-22T22:23:42Z|00135|binding|INFO|Releasing lport c2b5e191-6c34-4707-83d4-b3c5bc12ff1e from this chassis (sb_readonly=0)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.287 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.289 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.291 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[beecc1fb-30b8-4d70-96bc-2e69070f7c94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.292 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:23:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:42.293 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'env', 'PROCESS_TAG=haproxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd5f6392-bfb2-42bf-a825-c0516c8891b0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.408 182729 INFO nova.virt.libvirt.driver [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance destroyed successfully.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.408 182729 DEBUG nova.objects.instance [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.426 182729 INFO nova.virt.libvirt.driver [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Deleting instance files /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_del
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.434 182729 INFO nova.virt.libvirt.driver [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Deletion of /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_del complete
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.492 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120622.4915533, 5d4456c4-888d-4a4f-b820-b7eed8f26b8b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.492 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] VM Started (Lifecycle Event)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.494 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.498 182729 INFO nova.compute.manager [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.499 182729 DEBUG oslo.service.loopingcall [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.499 182729 DEBUG nova.compute.manager [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.499 182729 DEBUG nova.network.neutron [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.503 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.515 182729 INFO nova.virt.libvirt.driver [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Instance spawned successfully.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.515 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.520 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.529 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.541 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.542 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.543 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.544 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.544 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.545 182729 DEBUG nova.virt.libvirt.driver [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.553 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.555 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120622.491827, 5d4456c4-888d-4a4f-b820-b7eed8f26b8b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.555 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] VM Paused (Lifecycle Event)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.580 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.585 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120622.498445, 5d4456c4-888d-4a4f-b820-b7eed8f26b8b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.585 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] VM Resumed (Lifecycle Event)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.607 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.612 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.635 182729 INFO nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Took 6.28 seconds to spawn the instance on the hypervisor.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.636 182729 DEBUG nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.637 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:23:42 compute-0 podman[217107]: 2026-01-22 22:23:42.705744557 +0000 UTC m=+0.060557386 container create b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.719 182729 INFO nova.compute.manager [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Took 6.88 seconds to build instance.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.721 182729 DEBUG nova.network.neutron [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.738 182729 DEBUG oslo_concurrency.lockutils [None req-f4c4d098-cbc5-4eac-8384-ac8e8a17eb71 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.744 182729 DEBUG nova.network.neutron [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:42 compute-0 systemd[1]: Started libpod-conmon-b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9.scope.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.759 182729 INFO nova.compute.manager [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Took 0.26 seconds to deallocate network for instance.
Jan 22 22:23:42 compute-0 podman[217107]: 2026-01-22 22:23:42.667620206 +0000 UTC m=+0.022433055 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:23:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:23:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01ad3e111c09e57274ddacb1b4f75a561e66f7b179815a7e8e85093a6911bfb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:23:42 compute-0 podman[217107]: 2026-01-22 22:23:42.830321194 +0000 UTC m=+0.185134123 container init b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:23:42 compute-0 podman[217107]: 2026-01-22 22:23:42.836510997 +0000 UTC m=+0.191323866 container start b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.854 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.855 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:42 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [NOTICE]   (217126) : New worker (217128) forked
Jan 22 22:23:42 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [NOTICE]   (217126) : Loading success.
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.869 182729 DEBUG nova.network.neutron [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Updated VIF entry in instance network info cache for port c3986942-6b78-4dab-8b76-b23ae6f1fb0e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.870 182729 DEBUG nova.network.neutron [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Updating instance_info_cache with network_info: [{"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.895 182729 DEBUG oslo_concurrency.lockutils [req-2706ad0d-d8bc-4c6b-afa2-8f18a46d492d req-3bb85bb0-95fd-4d67-8fe5-9851317199d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5d4456c4-888d-4a4f-b820-b7eed8f26b8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.945 182729 DEBUG nova.compute.provider_tree [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:23:42 compute-0 nova_compute[182725]: 2026-01-22 22:23:42.971 182729 DEBUG nova.scheduler.client.report [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:23:43 compute-0 nova_compute[182725]: 2026-01-22 22:23:43.006 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:43 compute-0 nova_compute[182725]: 2026-01-22 22:23:43.041 182729 INFO nova.scheduler.client.report [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Deleted allocations for instance 79166459-7b8b-44ed-8dba-0ba4cb9d97ff
Jan 22 22:23:43 compute-0 nova_compute[182725]: 2026-01-22 22:23:43.130 182729 DEBUG oslo_concurrency.lockutils [None req-ed1d9dbe-4fed-4110-beec-12e5dd921da0 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:43 compute-0 nova_compute[182725]: 2026-01-22 22:23:43.670 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:43 compute-0 nova_compute[182725]: 2026-01-22 22:23:43.928 182729 DEBUG nova.compute.manager [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.045 182729 INFO nova.compute.manager [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] instance snapshotting
Jan 22 22:23:44 compute-0 podman[217137]: 2026-01-22 22:23:44.151325292 +0000 UTC m=+0.077993447 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 22:23:44 compute-0 podman[217138]: 2026-01-22 22:23:44.1661957 +0000 UTC m=+0.083824762 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.399 182729 DEBUG nova.compute.manager [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.399 182729 DEBUG oslo_concurrency.lockutils [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.399 182729 DEBUG oslo_concurrency.lockutils [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.400 182729 DEBUG oslo_concurrency.lockutils [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.400 182729 DEBUG nova.compute.manager [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] No waiting events found dispatching network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.400 182729 WARNING nova.compute.manager [req-30166d48-103d-4637-b3ab-affe66b1d4ab req-84ff9cfe-11eb-484b-9ff1-24bd2c707ef8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received unexpected event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e for instance with vm_state active and task_state image_snapshot.
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.669 182729 INFO nova.virt.libvirt.driver [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Beginning live snapshot process
Jan 22 22:23:44 compute-0 virtqemud[182297]: invalid argument: disk vda does not have an active block job
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.853 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.961 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json -f qcow2" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:44 compute-0 nova_compute[182725]: 2026-01-22 22:23:44.962 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.024 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.047 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.119 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.121 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.169 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9.delta 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.171 182729 INFO nova.virt.libvirt.driver [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.229 182729 DEBUG nova.virt.libvirt.guest [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.233 182729 INFO nova.virt.libvirt.driver [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.270 182729 DEBUG nova.privsep.utils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.271 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9.delta /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.431 182729 DEBUG oslo_concurrency.processutils [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9.delta /var/lib/nova/instances/snapshots/tmp5jjb6y_k/4078ea9a8d404d7197de1761c35f9ce9" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:23:45 compute-0 nova_compute[182725]: 2026-01-22 22:23:45.432 182729 INFO nova.virt.libvirt.driver [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Snapshot extracted, beginning image upload
Jan 22 22:23:46 compute-0 nova_compute[182725]: 2026-01-22 22:23:46.150 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:46 compute-0 nova_compute[182725]: 2026-01-22 22:23:46.564 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120611.562532, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:46 compute-0 nova_compute[182725]: 2026-01-22 22:23:46.565 182729 INFO nova.compute.manager [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Stopped (Lifecycle Event)
Jan 22 22:23:46 compute-0 nova_compute[182725]: 2026-01-22 22:23:46.605 182729 DEBUG nova.compute.manager [None req-f8872072-c809-4aa6-b9d8-41a18b6bdcdb - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:23:47.278 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:23:48 compute-0 nova_compute[182725]: 2026-01-22 22:23:48.157 182729 INFO nova.virt.libvirt.driver [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Snapshot image upload complete
Jan 22 22:23:48 compute-0 nova_compute[182725]: 2026-01-22 22:23:48.158 182729 INFO nova.compute.manager [None req-8f7fe91a-71a1-4b2e-a688-bc50667446f0 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Took 4.10 seconds to snapshot the instance on the hypervisor.
Jan 22 22:23:48 compute-0 podman[217206]: 2026-01-22 22:23:48.182671914 +0000 UTC m=+0.106051781 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:23:48 compute-0 nova_compute[182725]: 2026-01-22 22:23:48.673 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:51 compute-0 nova_compute[182725]: 2026-01-22 22:23:51.154 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:53 compute-0 nova_compute[182725]: 2026-01-22 22:23:53.675 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:54 compute-0 ovn_controller[94850]: 2026-01-22T22:23:54Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:34:52 10.100.0.8
Jan 22 22:23:54 compute-0 ovn_controller[94850]: 2026-01-22T22:23:54Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:34:52 10.100.0.8
Jan 22 22:23:56 compute-0 nova_compute[182725]: 2026-01-22 22:23:56.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:23:57 compute-0 nova_compute[182725]: 2026-01-22 22:23:57.405 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120622.4036522, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:23:57 compute-0 nova_compute[182725]: 2026-01-22 22:23:57.406 182729 INFO nova.compute.manager [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Stopped (Lifecycle Event)
Jan 22 22:23:57 compute-0 nova_compute[182725]: 2026-01-22 22:23:57.427 182729 DEBUG nova.compute.manager [None req-278af3a6-3522-40b9-8394-464607d725c7 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:23:58 compute-0 nova_compute[182725]: 2026-01-22 22:23:58.679 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:01 compute-0 nova_compute[182725]: 2026-01-22 22:24:01.158 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:02 compute-0 podman[217247]: 2026-01-22 22:24:02.139639399 +0000 UTC m=+0.068465832 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:24:03 compute-0 nova_compute[182725]: 2026-01-22 22:24:03.682 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:03 compute-0 nova_compute[182725]: 2026-01-22 22:24:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:03 compute-0 nova_compute[182725]: 2026-01-22 22:24:03.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:24:04 compute-0 podman[217269]: 2026-01-22 22:24:04.138337105 +0000 UTC m=+0.066766410 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 22:24:04 compute-0 podman[217268]: 2026-01-22 22:24:04.177364079 +0000 UTC m=+0.111948166 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 22:24:05 compute-0 nova_compute[182725]: 2026-01-22 22:24:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:06 compute-0 nova_compute[182725]: 2026-01-22 22:24:06.161 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.518 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.519 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.519 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.519 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.520 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.531 182729 INFO nova.compute.manager [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Terminating instance
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.543 182729 DEBUG nova.compute.manager [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:24:07 compute-0 kernel: tapc3986942-6b (unregistering): left promiscuous mode
Jan 22 22:24:07 compute-0 NetworkManager[54954]: <info>  [1769120647.5673] device (tapc3986942-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:24:07 compute-0 ovn_controller[94850]: 2026-01-22T22:24:07Z|00136|binding|INFO|Releasing lport c3986942-6b78-4dab-8b76-b23ae6f1fb0e from this chassis (sb_readonly=0)
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.583 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 ovn_controller[94850]: 2026-01-22T22:24:07Z|00137|binding|INFO|Setting lport c3986942-6b78-4dab-8b76-b23ae6f1fb0e down in Southbound
Jan 22 22:24:07 compute-0 ovn_controller[94850]: 2026-01-22T22:24:07Z|00138|binding|INFO|Removing iface tapc3986942-6b ovn-installed in OVS
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.588 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.594 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:34:52 10.100.0.8'], port_security=['fa:16:3e:c0:34:52 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d4456c4-888d-4a4f-b820-b7eed8f26b8b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c3986942-6b78-4dab-8b76-b23ae6f1fb0e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.595 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c3986942-6b78-4dab-8b76-b23ae6f1fb0e in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.596 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.598 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f86f9a72-7869-4ab5-9efa-39e2b51c9d7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.600 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace which is not needed anymore
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 22 22:24:07 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000030.scope: Consumed 13.166s CPU time.
Jan 22 22:24:07 compute-0 systemd-machined[154006]: Machine qemu-21-instance-00000030 terminated.
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [NOTICE]   (217126) : haproxy version is 2.8.14-c23fe91
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [NOTICE]   (217126) : path to executable is /usr/sbin/haproxy
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [WARNING]  (217126) : Exiting Master process...
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [WARNING]  (217126) : Exiting Master process...
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [ALERT]    (217126) : Current worker (217128) exited with code 143 (Terminated)
Jan 22 22:24:07 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217122]: [WARNING]  (217126) : All workers exited. Exiting... (0)
Jan 22 22:24:07 compute-0 systemd[1]: libpod-b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9.scope: Deactivated successfully.
Jan 22 22:24:07 compute-0 podman[217339]: 2026-01-22 22:24:07.749055228 +0000 UTC m=+0.050448637 container died b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.772 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.779 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9-userdata-shm.mount: Deactivated successfully.
Jan 22 22:24:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-01ad3e111c09e57274ddacb1b4f75a561e66f7b179815a7e8e85093a6911bfb8-merged.mount: Deactivated successfully.
Jan 22 22:24:07 compute-0 podman[217339]: 2026-01-22 22:24:07.80338101 +0000 UTC m=+0.104774419 container cleanup b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:24:07 compute-0 systemd[1]: libpod-conmon-b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9.scope: Deactivated successfully.
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.820 182729 INFO nova.virt.libvirt.driver [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Instance destroyed successfully.
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.821 182729 DEBUG nova.objects.instance [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'resources' on Instance uuid 5d4456c4-888d-4a4f-b820-b7eed8f26b8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.845 182729 DEBUG nova.virt.libvirt.vif [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-314735179',display_name='tempest-ImagesTestJSON-server-314735179',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-314735179',id=48,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:23:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ln28ieri',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:23:48Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=5d4456c4-888d-4a4f-b820-b7eed8f26b8b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.846 182729 DEBUG nova.network.os_vif_util [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "address": "fa:16:3e:c0:34:52", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3986942-6b", "ovs_interfaceid": "c3986942-6b78-4dab-8b76-b23ae6f1fb0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.847 182729 DEBUG nova.network.os_vif_util [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.847 182729 DEBUG os_vif [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.850 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3986942-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.852 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.855 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.858 182729 INFO os_vif [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:34:52,bridge_name='br-int',has_traffic_filtering=True,id=c3986942-6b78-4dab-8b76-b23ae6f1fb0e,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3986942-6b')
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.858 182729 INFO nova.virt.libvirt.driver [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Deleting instance files /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b_del
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.859 182729 INFO nova.virt.libvirt.driver [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Deletion of /var/lib/nova/instances/5d4456c4-888d-4a4f-b820-b7eed8f26b8b_del complete
Jan 22 22:24:07 compute-0 podman[217380]: 2026-01-22 22:24:07.878747831 +0000 UTC m=+0.045671549 container remove b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.884 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4e5dbf-7454-4dc4-bc62-f6647fa85944]: (4, ('Thu Jan 22 10:24:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9)\nb376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9\nThu Jan 22 10:24:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (b376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9)\nb376a3705a1bfbc9f3765df56eb34ded4127a10aba1d0f9e4bb53685e702ddf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.887 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[debf1d72-f524-4974-9c58-2b5c64a4234a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.888 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:07 compute-0 kernel: tapdd5f6392-b0: left promiscuous mode
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.905 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.910 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[da51bde5-7fb2-44a7-94ca-4065adb58b0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.924 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[08d6ff13-196a-4c5f-ad30-e8109a67b43f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.927 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0b20bc-c730-42f1-9579-50ed1703e2c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.942 182729 INFO nova.compute.manager [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.943 182729 DEBUG oslo.service.loopingcall [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.943 182729 DEBUG nova.compute.manager [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:24:07 compute-0 nova_compute[182725]: 2026-01-22 22:24:07.943 182729 DEBUG nova.network.neutron [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.952 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbab50f-31ec-4cc5-9300-f40a20d66b7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425952, 'reachable_time': 38553, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217395, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.956 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:24:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:07.956 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9a5ab635-078e-41a0-bb37-b11841670772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:07 compute-0 systemd[1]: run-netns-ovnmeta\x2ddd5f6392\x2dbfb2\x2d42bf\x2da825\x2dc0516c8891b0.mount: Deactivated successfully.
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.512 182729 DEBUG nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-unplugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.513 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.514 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.514 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.514 182729 DEBUG nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] No waiting events found dispatching network-vif-unplugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.514 182729 DEBUG nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-unplugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.514 182729 DEBUG nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.515 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.515 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.515 182729 DEBUG oslo_concurrency.lockutils [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.515 182729 DEBUG nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] No waiting events found dispatching network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.515 182729 WARNING nova.compute.manager [req-7fd48f5e-7bd9-48ab-9680-62f08b633401 req-0e83ac1c-7a71-4548-8148-c4d8b3710f4b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received unexpected event network-vif-plugged-c3986942-6b78-4dab-8b76-b23ae6f1fb0e for instance with vm_state active and task_state deleting.
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.687 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.705 182729 DEBUG nova.network.neutron [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.733 182729 INFO nova.compute.manager [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Took 0.79 seconds to deallocate network for instance.
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.835 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:08 compute-0 nova_compute[182725]: 2026-01-22 22:24:08.837 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:09 compute-0 nova_compute[182725]: 2026-01-22 22:24:09.036 182729 DEBUG nova.compute.provider_tree [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:09 compute-0 nova_compute[182725]: 2026-01-22 22:24:09.054 182729 DEBUG nova.scheduler.client.report [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:09 compute-0 nova_compute[182725]: 2026-01-22 22:24:09.090 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:24:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:24:09 compute-0 nova_compute[182725]: 2026-01-22 22:24:09.190 182729 INFO nova.scheduler.client.report [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Deleted allocations for instance 5d4456c4-888d-4a4f-b820-b7eed8f26b8b
Jan 22 22:24:09 compute-0 nova_compute[182725]: 2026-01-22 22:24:09.302 182729 DEBUG oslo_concurrency.lockutils [None req-13506dc1-b775-42a2-b142-d17e1e235fbb 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5d4456c4-888d-4a4f-b820-b7eed8f26b8b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:10 compute-0 nova_compute[182725]: 2026-01-22 22:24:10.642 182729 DEBUG nova.compute.manager [req-31a0a741-f36e-4056-bfff-1ec026133602 req-0418bddd-af73-4fa1-b28e-4f0834062e6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Received event network-vif-deleted-c3986942-6b78-4dab-8b76-b23ae6f1fb0e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:10 compute-0 nova_compute[182725]: 2026-01-22 22:24:10.897 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.794 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.795 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.828 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.929 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.930 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.930 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.930 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.932 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.932 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.940 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:24:11 compute-0 nova_compute[182725]: 2026-01-22 22:24:11.940 182729 INFO nova.compute.claims [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.094 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.095 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=73.37660217285156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.096 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.264 182729 DEBUG nova.compute.provider_tree [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.287 182729 DEBUG nova.scheduler.client.report [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.305 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.306 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.308 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.383 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 384ca0fc-ad57-4f20-be7c-486868e52dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.384 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.384 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.386 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.387 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.406 182729 INFO nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.429 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:24:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:12.431 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:12.433 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:12.433 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.436 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.453 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.480 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.481 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.535 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.537 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.538 182729 INFO nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Creating image(s)
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.539 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.539 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.540 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.566 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.634 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.635 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.636 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.647 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.679 182729 DEBUG nova.policy [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d9fe7f0e8b4edf92fa2064aaab8bca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3a2ee662fba426c8f688455b20759bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.717 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.717 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.757 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.758 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.758 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.815 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.817 182729 DEBUG nova.virt.disk.api [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Checking if we can resize image /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.817 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.854 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.909 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.910 182729 DEBUG nova.virt.disk.api [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Cannot resize image /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.911 182729 DEBUG nova.objects.instance [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'migration_context' on Instance uuid 384ca0fc-ad57-4f20-be7c-486868e52dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.927 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.928 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Ensure instance console log exists: /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.929 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.929 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:12 compute-0 nova_compute[182725]: 2026-01-22 22:24:12.929 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.419 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Successfully created port: 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.482 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.482 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.482 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.499 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.499 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.499 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.500 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.500 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.500 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:24:13 compute-0 nova_compute[182725]: 2026-01-22 22:24:13.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.613 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Successfully updated port: 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.653 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.654 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquired lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.654 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.806 182729 DEBUG nova.compute.manager [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-changed-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.807 182729 DEBUG nova.compute.manager [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Refreshing instance network info cache due to event network-changed-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.807 182729 DEBUG oslo_concurrency.lockutils [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:14 compute-0 nova_compute[182725]: 2026-01-22 22:24:14.867 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:24:15 compute-0 podman[217413]: 2026-01-22 22:24:15.161638923 +0000 UTC m=+0.081031462 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:24:15 compute-0 podman[217412]: 2026-01-22 22:24:15.164301509 +0000 UTC m=+0.084757154 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.030 182729 DEBUG nova.network.neutron [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Updating instance_info_cache with network_info: [{"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.069 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Releasing lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.069 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Instance network_info: |[{"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.070 182729 DEBUG oslo_concurrency.lockutils [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.070 182729 DEBUG nova.network.neutron [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Refreshing network info cache for port 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.074 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Start _get_guest_xml network_info=[{"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.084 182729 WARNING nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.286 182729 DEBUG nova.virt.libvirt.host [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.288 182729 DEBUG nova.virt.libvirt.host [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.306 182729 DEBUG nova.virt.libvirt.host [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.307 182729 DEBUG nova.virt.libvirt.host [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.308 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.309 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.309 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.310 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.310 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.310 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.310 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.311 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.311 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.311 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.311 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.312 182729 DEBUG nova.virt.hardware [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.315 182729 DEBUG nova.virt.libvirt.vif [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-830004643',display_name='tempest-ImagesTestJSON-server-830004643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-830004643',id=52,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-jed0osuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:12Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=384ca0fc-ad57-4f20-be7c-486868e52dc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.315 182729 DEBUG nova.network.os_vif_util [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.316 182729 DEBUG nova.network.os_vif_util [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.318 182729 DEBUG nova.objects.instance [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 384ca0fc-ad57-4f20-be7c-486868e52dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.333 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <uuid>384ca0fc-ad57-4f20-be7c-486868e52dc0</uuid>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <name>instance-00000034</name>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:name>tempest-ImagesTestJSON-server-830004643</nova:name>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:24:16</nova:creationTime>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:user uuid="52d9fe7f0e8b4edf92fa2064aaab8bca">tempest-ImagesTestJSON-23148374-project-member</nova:user>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:project uuid="d3a2ee662fba426c8f688455b20759bf">tempest-ImagesTestJSON-23148374</nova:project>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         <nova:port uuid="70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b">
Jan 22 22:24:16 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <system>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="serial">384ca0fc-ad57-4f20-be7c-486868e52dc0</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="uuid">384ca0fc-ad57-4f20-be7c-486868e52dc0</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </system>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <os>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </os>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <features>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </features>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.config"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:de:87:77"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <target dev="tap70f58ea5-01"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/console.log" append="off"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <video>
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </video>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:24:16 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:24:16 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:24:16 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:24:16 compute-0 nova_compute[182725]: </domain>
Jan 22 22:24:16 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.334 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Preparing to wait for external event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.334 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.335 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.335 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.336 182729 DEBUG nova.virt.libvirt.vif [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-830004643',display_name='tempest-ImagesTestJSON-server-830004643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-830004643',id=52,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-jed0osuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:12Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=384ca0fc-ad57-4f20-be7c-486868e52dc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.336 182729 DEBUG nova.network.os_vif_util [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.337 182729 DEBUG nova.network.os_vif_util [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.337 182729 DEBUG os_vif [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.338 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.338 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.338 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.342 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.342 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70f58ea5-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.343 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70f58ea5-01, col_values=(('external_ids', {'iface-id': '70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:87:77', 'vm-uuid': '384ca0fc-ad57-4f20-be7c-486868e52dc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.344 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:16 compute-0 NetworkManager[54954]: <info>  [1769120656.3462] manager: (tap70f58ea5-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.348 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.353 182729 INFO os_vif [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01')
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.497 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.497 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.497 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No VIF found with MAC fa:16:3e:de:87:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.498 182729 INFO nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Using config drive
Jan 22 22:24:16 compute-0 nova_compute[182725]: 2026-01-22 22:24:16.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.669 182729 INFO nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Creating config drive at /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.config
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.673 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8k810vva execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.815 182729 DEBUG oslo_concurrency.processutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8k810vva" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:17 compute-0 kernel: tap70f58ea5-01: entered promiscuous mode
Jan 22 22:24:17 compute-0 NetworkManager[54954]: <info>  [1769120657.8944] manager: (tap70f58ea5-01): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 22 22:24:17 compute-0 ovn_controller[94850]: 2026-01-22T22:24:17Z|00139|binding|INFO|Claiming lport 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b for this chassis.
Jan 22 22:24:17 compute-0 ovn_controller[94850]: 2026-01-22T22:24:17Z|00140|binding|INFO|70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b: Claiming fa:16:3e:de:87:77 10.100.0.8
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.894 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:17 compute-0 ovn_controller[94850]: 2026-01-22T22:24:17Z|00141|binding|INFO|Setting lport 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b ovn-installed in OVS
Jan 22 22:24:17 compute-0 ovn_controller[94850]: 2026-01-22T22:24:17Z|00142|binding|INFO|Setting lport 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b up in Southbound
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.919 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:87:77 10.100.0.8'], port_security=['fa:16:3e:de:87:77 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '384ca0fc-ad57-4f20-be7c-486868e52dc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.920 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 bound to our chassis
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.921 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:24:17 compute-0 nova_compute[182725]: 2026-01-22 22:24:17.923 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:17 compute-0 systemd-udevd[217469]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.937 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bef0c874-5fe9-47ac-a6ee-7723254d1bd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.938 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd5f6392-b1 in ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:24:17 compute-0 NetworkManager[54954]: <info>  [1769120657.9414] device (tap70f58ea5-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.940 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd5f6392-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.941 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e74088d-9038-4e6a-9337-8a5129f2ce34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:17 compute-0 NetworkManager[54954]: <info>  [1769120657.9423] device (tap70f58ea5-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.942 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bb698402-2cec-4055-861f-8b6b91d74034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:17 compute-0 systemd-machined[154006]: New machine qemu-22-instance-00000034.
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.957 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[96d634d2-701e-4cbb-b0c1-b8f1d5a82192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:17 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000034.
Jan 22 22:24:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:17.986 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ce49e6-cc8e-4bcd-80fb-f3ee139fcadb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.019 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[34aa5b1e-059f-4f35-b3ca-c8df6c8de34d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 NetworkManager[54954]: <info>  [1769120658.0275] manager: (tapdd5f6392-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.026 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[492f1ede-7e57-4333-9531-49e9d60f1cc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.067 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c53e050b-a87b-4fd4-ae5f-3868884cf176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.072 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[117884f9-1265-41f8-b8d8-50f8c442d0f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 NetworkManager[54954]: <info>  [1769120658.1038] device (tapdd5f6392-b0): carrier: link connected
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.111 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[587db592-4c30-4233-93c7-0de7bcc41804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.132 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8333af9c-f7b6-4809-b3c2-e6d3a91206b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429570, 'reachable_time': 43778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217505, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.153 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f0bf57-a069-4297-84c9-2a9180c40bb2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:d723'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 429570, 'tstamp': 429570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217506, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.179 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b7dfa15e-b3eb-483c-8186-d9c191f1cc88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429570, 'reachable_time': 43778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217507, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.226 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f08b9985-83a1-47a8-8bfb-8cfeac41ec4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.307 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8185b3a3-be01-41a8-8e2a-eb352df574e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.309 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.309 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.309 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd5f6392-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:18 compute-0 NetworkManager[54954]: <info>  [1769120658.3123] manager: (tapdd5f6392-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 22 22:24:18 compute-0 kernel: tapdd5f6392-b0: entered promiscuous mode
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.311 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.314 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.315 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd5f6392-b0, col_values=(('external_ids', {'iface-id': 'c2b5e191-6c34-4707-83d4-b3c5bc12ff1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.316 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 ovn_controller[94850]: 2026-01-22T22:24:18Z|00143|binding|INFO|Releasing lport c2b5e191-6c34-4707-83d4-b3c5bc12ff1e from this chassis (sb_readonly=0)
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.318 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.319 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[51fcf6f6-a489-46aa-a9bc-23eba1d4441d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.320 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.318 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:18.322 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'env', 'PROCESS_TAG=haproxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd5f6392-bfb2-42bf-a825-c0516c8891b0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.493 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120658.492561, 384ca0fc-ad57-4f20-be7c-486868e52dc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.493 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] VM Started (Lifecycle Event)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.512 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.517 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120658.492763, 384ca0fc-ad57-4f20-be7c-486868e52dc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.517 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] VM Paused (Lifecycle Event)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.533 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.537 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.553 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.691 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:18 compute-0 podman[217544]: 2026-01-22 22:24:18.80250271 +0000 UTC m=+0.066567925 container create e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:24:18 compute-0 systemd[1]: Started libpod-conmon-e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e.scope.
Jan 22 22:24:18 compute-0 podman[217544]: 2026-01-22 22:24:18.775585665 +0000 UTC m=+0.039650900 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:24:18 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:24:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cec60825f546fc7dd547e4b25a5d3758b5592ac5192d44d950ed3a3a09e2d5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.897 182729 DEBUG nova.compute.manager [req-82834a16-244c-4ea8-8aab-d42d68b27c74 req-851038b3-a459-431e-9fb5-9ec557cdb098 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.898 182729 DEBUG oslo_concurrency.lockutils [req-82834a16-244c-4ea8-8aab-d42d68b27c74 req-851038b3-a459-431e-9fb5-9ec557cdb098 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.899 182729 DEBUG oslo_concurrency.lockutils [req-82834a16-244c-4ea8-8aab-d42d68b27c74 req-851038b3-a459-431e-9fb5-9ec557cdb098 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.899 182729 DEBUG oslo_concurrency.lockutils [req-82834a16-244c-4ea8-8aab-d42d68b27c74 req-851038b3-a459-431e-9fb5-9ec557cdb098 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:18 compute-0 podman[217544]: 2026-01-22 22:24:18.897701892 +0000 UTC m=+0.161767117 container init e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.899 182729 DEBUG nova.compute.manager [req-82834a16-244c-4ea8-8aab-d42d68b27c74 req-851038b3-a459-431e-9fb5-9ec557cdb098 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Processing event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.900 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:24:18 compute-0 podman[217544]: 2026-01-22 22:24:18.910279912 +0000 UTC m=+0.174345127 container start e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.910 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120658.9099524, 384ca0fc-ad57-4f20-be7c-486868e52dc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.911 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] VM Resumed (Lifecycle Event)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.913 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.917 182729 INFO nova.virt.libvirt.driver [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Instance spawned successfully.
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.918 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.938 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.944 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.944 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.944 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.945 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.945 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.945 182729 DEBUG nova.virt.libvirt.driver [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.949 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:18 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [NOTICE]   (217575) : New worker (217588) forked
Jan 22 22:24:18 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [NOTICE]   (217575) : Loading success.
Jan 22 22:24:18 compute-0 podman[217557]: 2026-01-22 22:24:18.953818708 +0000 UTC m=+0.092703761 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:24:18 compute-0 nova_compute[182725]: 2026-01-22 22:24:18.989 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.019 182729 INFO nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Took 6.48 seconds to spawn the instance on the hypervisor.
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.020 182729 DEBUG nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.108 182729 INFO nova.compute.manager [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Took 7.21 seconds to build instance.
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.131 182729 DEBUG oslo_concurrency.lockutils [None req-de75bc63-eb0f-4c22-93eb-1bee2961d804 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.674 182729 DEBUG nova.network.neutron [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Updated VIF entry in instance network info cache for port 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.675 182729 DEBUG nova.network.neutron [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Updating instance_info_cache with network_info: [{"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.699 182729 DEBUG oslo_concurrency.lockutils [req-ef6fd12b-d8b4-4b4d-99af-dc081283b23c req-7d580fbe-e402-4815-821f-51ce50f90b3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-384ca0fc-ad57-4f20-be7c-486868e52dc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:19 compute-0 nova_compute[182725]: 2026-01-22 22:24:19.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.075 182729 DEBUG nova.compute.manager [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.076 182729 DEBUG oslo_concurrency.lockutils [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.076 182729 DEBUG oslo_concurrency.lockutils [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.076 182729 DEBUG oslo_concurrency.lockutils [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.076 182729 DEBUG nova.compute.manager [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] No waiting events found dispatching network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.077 182729 WARNING nova.compute.manager [req-eb124b2d-4539-403d-8325-37df9dcc3455 req-f2443e38-6893-4531-a6e4-c3ed1667e69c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received unexpected event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b for instance with vm_state active and task_state None.
Jan 22 22:24:21 compute-0 nova_compute[182725]: 2026-01-22 22:24:21.346 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:22 compute-0 nova_compute[182725]: 2026-01-22 22:24:22.818 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120647.8167012, 5d4456c4-888d-4a4f-b820-b7eed8f26b8b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:22 compute-0 nova_compute[182725]: 2026-01-22 22:24:22.819 182729 INFO nova.compute.manager [-] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] VM Stopped (Lifecycle Event)
Jan 22 22:24:22 compute-0 nova_compute[182725]: 2026-01-22 22:24:22.845 182729 DEBUG nova.compute.manager [None req-e9968efa-ac08-4b69-b5da-26c42373e31c - - - - - -] [instance: 5d4456c4-888d-4a4f-b820-b7eed8f26b8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.001 182729 DEBUG nova.compute.manager [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.113 182729 INFO nova.compute.manager [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] instance snapshotting
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.493 182729 INFO nova.virt.libvirt.driver [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Beginning live snapshot process
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.693 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:23 compute-0 virtqemud[182297]: invalid argument: disk vda does not have an active block job
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.743 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.815 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json -f qcow2" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.817 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.901 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0/disk --force-share --output=json -f qcow2" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:23 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.922 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:23.998 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.000 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.477 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1.delta 1073741824" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.478 182729 INFO nova.virt.libvirt.driver [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.556 182729 DEBUG nova.virt.libvirt.guest [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.567 182729 INFO nova.virt.libvirt.driver [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.906 182729 DEBUG nova.privsep.utils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:24:24 compute-0 nova_compute[182725]: 2026-01-22 22:24:24.907 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1.delta /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:25 compute-0 nova_compute[182725]: 2026-01-22 22:24:25.533 182729 DEBUG oslo_concurrency.processutils [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1.delta /var/lib/nova/instances/snapshots/tmpimspphih/b8eb395dd87f4af38b68bc360c3594d1" returned: 0 in 0.626s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:25 compute-0 nova_compute[182725]: 2026-01-22 22:24:25.536 182729 INFO nova.virt.libvirt.driver [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Snapshot extracted, beginning image upload
Jan 22 22:24:25 compute-0 nova_compute[182725]: 2026-01-22 22:24:25.914 182729 WARNING nova.compute.manager [None req-1662fe9b-82ef-433f-b930-361a94f33f63 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Image not found during snapshot: nova.exception.ImageNotFound: Image 505ae749-b6e8-4b0e-821f-3bc5b49ca119 could not be found.
Jan 22 22:24:26 compute-0 nova_compute[182725]: 2026-01-22 22:24:26.349 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:26 compute-0 nova_compute[182725]: 2026-01-22 22:24:26.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:24:26 compute-0 nova_compute[182725]: 2026-01-22 22:24:26.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:24:26 compute-0 nova_compute[182725]: 2026-01-22 22:24:26.917 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.486 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.488 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.488 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.489 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.489 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.503 182729 INFO nova.compute.manager [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Terminating instance
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.519 182729 DEBUG nova.compute.manager [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:24:27 compute-0 kernel: tap70f58ea5-01 (unregistering): left promiscuous mode
Jan 22 22:24:27 compute-0 NetworkManager[54954]: <info>  [1769120667.5459] device (tap70f58ea5-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:24:27 compute-0 ovn_controller[94850]: 2026-01-22T22:24:27Z|00144|binding|INFO|Releasing lport 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b from this chassis (sb_readonly=0)
Jan 22 22:24:27 compute-0 ovn_controller[94850]: 2026-01-22T22:24:27Z|00145|binding|INFO|Setting lport 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b down in Southbound
Jan 22 22:24:27 compute-0 ovn_controller[94850]: 2026-01-22T22:24:27Z|00146|binding|INFO|Removing iface tap70f58ea5-01 ovn-installed in OVS
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.561 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:27.567 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:87:77 10.100.0.8'], port_security=['fa:16:3e:de:87:77 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '384ca0fc-ad57-4f20-be7c-486868e52dc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:27.569 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis
Jan 22 22:24:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:27.570 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:24:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:27.571 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c12b857f-486d-4122-8f96-facf7e75e28b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:27.572 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace which is not needed anymore
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.593 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:27 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 22 22:24:27 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000034.scope: Consumed 9.253s CPU time.
Jan 22 22:24:27 compute-0 systemd-machined[154006]: Machine qemu-22-instance-00000034 terminated.
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.799 182729 INFO nova.virt.libvirt.driver [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Instance destroyed successfully.
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.801 182729 DEBUG nova.objects.instance [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'resources' on Instance uuid 384ca0fc-ad57-4f20-be7c-486868e52dc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.819 182729 DEBUG nova.virt.libvirt.vif [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-830004643',display_name='tempest-ImagesTestJSON-server-830004643',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-830004643',id=52,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-jed0osuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:25Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=384ca0fc-ad57-4f20-be7c-486868e52dc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.820 182729 DEBUG nova.network.os_vif_util [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "address": "fa:16:3e:de:87:77", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70f58ea5-01", "ovs_interfaceid": "70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.821 182729 DEBUG nova.network.os_vif_util [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.822 182729 DEBUG os_vif [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.826 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.826 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70f58ea5-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.832 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.836 182729 INFO os_vif [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:77,bridge_name='br-int',has_traffic_filtering=True,id=70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70f58ea5-01')
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.837 182729 INFO nova.virt.libvirt.driver [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Deleting instance files /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0_del
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.838 182729 INFO nova.virt.libvirt.driver [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Deletion of /var/lib/nova/instances/384ca0fc-ad57-4f20-be7c-486868e52dc0_del complete
Jan 22 22:24:27 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [NOTICE]   (217575) : haproxy version is 2.8.14-c23fe91
Jan 22 22:24:27 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [NOTICE]   (217575) : path to executable is /usr/sbin/haproxy
Jan 22 22:24:27 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [WARNING]  (217575) : Exiting Master process...
Jan 22 22:24:27 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [ALERT]    (217575) : Current worker (217588) exited with code 143 (Terminated)
Jan 22 22:24:27 compute-0 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[217560]: [WARNING]  (217575) : All workers exited. Exiting... (0)
Jan 22 22:24:27 compute-0 systemd[1]: libpod-e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e.scope: Deactivated successfully.
Jan 22 22:24:27 compute-0 podman[217649]: 2026-01-22 22:24:27.905548629 +0000 UTC m=+0.221581914 container died e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.955 182729 INFO nova.compute.manager [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.956 182729 DEBUG oslo.service.loopingcall [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.956 182729 DEBUG nova.compute.manager [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:24:27 compute-0 nova_compute[182725]: 2026-01-22 22:24:27.956 182729 DEBUG nova.network.neutron [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.106 182729 DEBUG nova.compute.manager [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-unplugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.106 182729 DEBUG oslo_concurrency.lockutils [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.107 182729 DEBUG oslo_concurrency.lockutils [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.107 182729 DEBUG oslo_concurrency.lockutils [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.107 182729 DEBUG nova.compute.manager [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] No waiting events found dispatching network-vif-unplugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.107 182729 DEBUG nova.compute.manager [req-636964dd-7ef9-4af8-a9df-5fb7053a16b7 req-08680c67-b44d-408a-94d9-effc1ca8beae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-unplugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:24:28 compute-0 nova_compute[182725]: 2026-01-22 22:24:28.696 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e-userdata-shm.mount: Deactivated successfully.
Jan 22 22:24:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-8cec60825f546fc7dd547e4b25a5d3758b5592ac5192d44d950ed3a3a09e2d5f-merged.mount: Deactivated successfully.
Jan 22 22:24:29 compute-0 nova_compute[182725]: 2026-01-22 22:24:29.654 182729 DEBUG nova.network.neutron [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:29 compute-0 nova_compute[182725]: 2026-01-22 22:24:29.692 182729 INFO nova.compute.manager [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Took 1.74 seconds to deallocate network for instance.
Jan 22 22:24:29 compute-0 nova_compute[182725]: 2026-01-22 22:24:29.784 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:29 compute-0 nova_compute[182725]: 2026-01-22 22:24:29.785 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:29 compute-0 podman[217649]: 2026-01-22 22:24:29.877054884 +0000 UTC m=+2.193088119 container cleanup e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:24:29 compute-0 systemd[1]: libpod-conmon-e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e.scope: Deactivated successfully.
Jan 22 22:24:29 compute-0 nova_compute[182725]: 2026-01-22 22:24:29.997 182729 DEBUG nova.compute.provider_tree [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.104 182729 DEBUG nova.scheduler.client.report [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.149 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.180 182729 INFO nova.scheduler.client.report [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Deleted allocations for instance 384ca0fc-ad57-4f20-be7c-486868e52dc0
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.259 182729 DEBUG nova.compute.manager [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.260 182729 DEBUG oslo_concurrency.lockutils [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.260 182729 DEBUG oslo_concurrency.lockutils [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.260 182729 DEBUG oslo_concurrency.lockutils [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.261 182729 DEBUG nova.compute.manager [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] No waiting events found dispatching network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.261 182729 WARNING nova.compute.manager [req-ce281d0b-7e47-476a-917d-f8bafc5a7887 req-85b615eb-995f-432c-accf-b39e0bda3c28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received unexpected event network-vif-plugged-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b for instance with vm_state deleted and task_state None.
Jan 22 22:24:30 compute-0 nova_compute[182725]: 2026-01-22 22:24:30.271 182729 DEBUG oslo_concurrency.lockutils [None req-d030441b-e46c-4a75-9c70-7c00a6f2ac64 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "384ca0fc-ad57-4f20-be7c-486868e52dc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:31 compute-0 podman[217699]: 2026-01-22 22:24:31.016676531 +0000 UTC m=+1.099059037 container remove e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.024 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3209ee1c-2135-4b61-91c7-4a1a85b4d8d2]: (4, ('Thu Jan 22 10:24:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e)\ne6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e\nThu Jan 22 10:24:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (e6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e)\ne6de35628b3335e229a3746297a6d6b20c0b1eb20cb250bdc524ad737490261e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.027 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[37a1ebae-de6a-4438-a31e-906384d26ccd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.029 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:31 compute-0 nova_compute[182725]: 2026-01-22 22:24:31.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:31 compute-0 kernel: tapdd5f6392-b0: left promiscuous mode
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.039 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[980dbd2b-94fe-4f3b-a6e8-0d04d17f4fa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 nova_compute[182725]: 2026-01-22 22:24:31.046 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.054 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5527b2f-0881-4a1a-80a7-dcf23bd9d25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.057 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f55ff106-e185-4d48-8dce-d2db25891a63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.079 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[45626742-7ac8-4316-a490-89edf613fa2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429561, 'reachable_time': 35974, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217713, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 systemd[1]: run-netns-ovnmeta\x2ddd5f6392\x2dbfb2\x2d42bf\x2da825\x2dc0516c8891b0.mount: Deactivated successfully.
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.084 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:24:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:31.084 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[14f392da-dfea-4d0c-bbbd-2c6a96840658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:31 compute-0 nova_compute[182725]: 2026-01-22 22:24:31.675 182729 DEBUG nova.compute.manager [req-5402acd2-5d7e-48da-b82d-dac8a641d034 req-692ecdb7-acd7-4a1e-8ad9-04180ecea46b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Received event network-vif-deleted-70f58ea5-01d9-4406-8d2b-4bc22f2c3d1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:32 compute-0 nova_compute[182725]: 2026-01-22 22:24:32.830 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:33 compute-0 podman[217718]: 2026-01-22 22:24:33.158678286 +0000 UTC m=+0.087483332 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:24:33 compute-0 nova_compute[182725]: 2026-01-22 22:24:33.699 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:35 compute-0 podman[217738]: 2026-01-22 22:24:35.17023514 +0000 UTC m=+0.104548373 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:24:35 compute-0 podman[217739]: 2026-01-22 22:24:35.179087879 +0000 UTC m=+0.100781570 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 22 22:24:37 compute-0 nova_compute[182725]: 2026-01-22 22:24:37.834 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:38 compute-0 nova_compute[182725]: 2026-01-22 22:24:38.700 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.043 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.043 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.070 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.102 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.187 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.188 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.222 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.228 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.228 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.239 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.240 182729 INFO nova.compute.claims [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.737 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.839 182729 DEBUG nova.compute.provider_tree [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.867 182729 DEBUG nova.scheduler.client.report [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.909 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.910 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.913 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.922 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.922 182729 INFO nova.compute.claims [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.985 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:24:41 compute-0 nova_compute[182725]: 2026-01-22 22:24:41.985 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.024 182729 INFO nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.060 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.102 182729 DEBUG nova.compute.provider_tree [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.118 182729 DEBUG nova.scheduler.client.report [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.160 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.161 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.213 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.216 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.216 182729 INFO nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Creating image(s)
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.217 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.217 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.218 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.238 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.239 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.241 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.274 182729 INFO nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.296 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.301 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.302 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.303 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.317 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.383 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.385 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.421 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.425 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.425 182729 INFO nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Creating image(s)
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.427 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.427 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.429 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.451 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk 1073741824" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.452 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.453 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.477 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.515 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.516 182729 DEBUG nova.virt.disk.api [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Checking if we can resize image /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.517 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.579 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.581 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.582 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.597 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.621 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.623 182729 DEBUG nova.virt.disk.api [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Cannot resize image /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.623 182729 DEBUG nova.objects.instance [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'migration_context' on Instance uuid 646e57e9-5637-42ce-b4f6-c1c5603f1e0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.663 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.664 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.705 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.706 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Ensure instance console log exists: /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.707 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.708 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.708 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.711 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.711 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.712 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.793 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.795 182729 DEBUG nova.virt.disk.api [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Checking if we can resize image /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.795 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.825 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120667.7976143, 384ca0fc-ad57-4f20-be7c-486868e52dc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.827 182729 INFO nova.compute.manager [-] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] VM Stopped (Lifecycle Event)
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.839 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.898 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.899 182729 DEBUG nova.virt.disk.api [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Cannot resize image /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.899 182729 DEBUG nova.objects.instance [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c72a954-328b-49cc-a01f-afd216318d5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:42 compute-0 nova_compute[182725]: 2026-01-22 22:24:42.980 182729 DEBUG nova.compute.manager [None req-c4779348-7101-413f-a3af-1ccc9c79f223 - - - - - -] [instance: 384ca0fc-ad57-4f20-be7c-486868e52dc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.370 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.371 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Ensure instance console log exists: /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.372 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.372 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.372 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.407 182729 DEBUG nova.policy [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60428d874c4b471889bd0e3c182d3b9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c3fca4180814795922e49898f54c932', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.651 182729 DEBUG nova.policy [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95cf9999380d48108a561554c1897f15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:24:43 compute-0 nova_compute[182725]: 2026-01-22 22:24:43.704 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:44.342 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:44 compute-0 nova_compute[182725]: 2026-01-22 22:24:44.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:44.346 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:24:44 compute-0 nova_compute[182725]: 2026-01-22 22:24:44.446 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Successfully created port: 7c16dfb4-1f91-44d6-9c64-9b08ba223050 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:24:44 compute-0 nova_compute[182725]: 2026-01-22 22:24:44.679 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Successfully created port: ca32495f-1eae-4a8d-98e8-46d372d9ac5d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.711 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Successfully updated port: 7c16dfb4-1f91-44d6-9c64-9b08ba223050 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.724 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.724 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquired lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.724 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.731 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Successfully updated port: ca32495f-1eae-4a8d-98e8-46d372d9ac5d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.745 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.745 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.746 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.863 182729 DEBUG nova.compute.manager [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-changed-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.863 182729 DEBUG nova.compute.manager [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Refreshing instance network info cache due to event network-changed-ca32495f-1eae-4a8d-98e8-46d372d9ac5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.864 182729 DEBUG oslo_concurrency.lockutils [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.929 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:24:45 compute-0 nova_compute[182725]: 2026-01-22 22:24:45.951 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:24:46 compute-0 nova_compute[182725]: 2026-01-22 22:24:46.014 182729 DEBUG nova.compute.manager [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:46 compute-0 nova_compute[182725]: 2026-01-22 22:24:46.014 182729 DEBUG nova.compute.manager [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing instance network info cache due to event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:24:46 compute-0 nova_compute[182725]: 2026-01-22 22:24:46.015 182729 DEBUG oslo_concurrency.lockutils [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:24:46 compute-0 podman[217815]: 2026-01-22 22:24:46.129043423 +0000 UTC m=+0.060187058 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:24:46 compute-0 podman[217814]: 2026-01-22 22:24:46.13014545 +0000 UTC m=+0.064232247 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 22:24:46 compute-0 nova_compute[182725]: 2026-01-22 22:24:46.987 182729 DEBUG nova.network.neutron [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Updating instance_info_cache with network_info: [{"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.030 182729 DEBUG nova.network.neutron [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.075 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.076 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Instance network_info: |[{"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.077 182729 DEBUG oslo_concurrency.lockutils [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.077 182729 DEBUG nova.network.neutron [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Refreshing network info cache for port ca32495f-1eae-4a8d-98e8-46d372d9ac5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.080 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Start _get_guest_xml network_info=[{"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.083 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Releasing lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.083 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Instance network_info: |[{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.084 182729 DEBUG oslo_concurrency.lockutils [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.084 182729 DEBUG nova.network.neutron [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.088 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Start _get_guest_xml network_info=[{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.103 182729 WARNING nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.107 182729 WARNING nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.112 182729 DEBUG nova.virt.libvirt.host [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.112 182729 DEBUG nova.virt.libvirt.host [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.113 182729 DEBUG nova.virt.libvirt.host [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.113 182729 DEBUG nova.virt.libvirt.host [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.116 182729 DEBUG nova.virt.libvirt.host [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.117 182729 DEBUG nova.virt.libvirt.host [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.118 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.118 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.119 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.119 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.119 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.120 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.120 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.120 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.120 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.120 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.121 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.121 182729 DEBUG nova.virt.hardware [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.125 182729 DEBUG nova.virt.libvirt.vif [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-931857640',display_name='tempest-DeleteServersTestJSON-server-931857640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-931857640',id=54,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-4w99ybxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:42Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=646e57e9-5637-42ce-b4f6-c1c5603f1e0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.125 182729 DEBUG nova.network.os_vif_util [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.126 182729 DEBUG nova.network.os_vif_util [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.127 182729 DEBUG nova.objects.instance [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 646e57e9-5637-42ce-b4f6-c1c5603f1e0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.129 182729 DEBUG nova.virt.libvirt.host [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.129 182729 DEBUG nova.virt.libvirt.host [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.130 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.130 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.131 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.131 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.131 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.131 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.132 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.132 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.132 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.132 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.133 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.133 182729 DEBUG nova.virt.hardware [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.136 182729 DEBUG nova.virt.libvirt.vif [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-263288411',id=55,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c3fca4180814795922e49898f54c932',ramdisk_id='',reservation_id='r-6nf41k0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:42Z,user_data=None,user_id='60428d874c4b471889bd0e3c182d3b9a',uuid=2c72a954-328b-49cc-a01f-afd216318d5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.137 182729 DEBUG nova.network.os_vif_util [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converting VIF {"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.138 182729 DEBUG nova.network.os_vif_util [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.138 182729 DEBUG nova.objects.instance [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c72a954-328b-49cc-a01f-afd216318d5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.163 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <uuid>646e57e9-5637-42ce-b4f6-c1c5603f1e0e</uuid>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <name>instance-00000036</name>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:name>tempest-DeleteServersTestJSON-server-931857640</nova:name>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:24:47</nova:creationTime>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:user uuid="95cf9999380d48108a561554c1897f15">tempest-DeleteServersTestJSON-1655437746-project-member</nova:user>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:project uuid="9f8f780ce45a4950b1666a54cd9a5ba0">tempest-DeleteServersTestJSON-1655437746</nova:project>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:port uuid="ca32495f-1eae-4a8d-98e8-46d372d9ac5d">
Jan 22 22:24:47 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <system>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="serial">646e57e9-5637-42ce-b4f6-c1c5603f1e0e</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="uuid">646e57e9-5637-42ce-b4f6-c1c5603f1e0e</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </system>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <os>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </os>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <features>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </features>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.config"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:8b:98:52"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="tapca32495f-1e"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/console.log" append="off"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <video>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </video>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:24:47 compute-0 nova_compute[182725]: </domain>
Jan 22 22:24:47 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.164 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Preparing to wait for external event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.165 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.165 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.165 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.166 182729 DEBUG nova.virt.libvirt.vif [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-931857640',display_name='tempest-DeleteServersTestJSON-server-931857640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-931857640',id=54,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-4w99ybxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:42Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=646e57e9-5637-42ce-b4f6-c1c5603f1e0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.166 182729 DEBUG nova.network.os_vif_util [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.167 182729 DEBUG nova.network.os_vif_util [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.168 182729 DEBUG os_vif [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.168 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.169 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.169 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.175 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.175 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca32495f-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.176 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapca32495f-1e, col_values=(('external_ids', {'iface-id': 'ca32495f-1eae-4a8d-98e8-46d372d9ac5d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:98:52', 'vm-uuid': '646e57e9-5637-42ce-b4f6-c1c5603f1e0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.178 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.180 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:47 compute-0 NetworkManager[54954]: <info>  [1769120687.1797] manager: (tapca32495f-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.185 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.186 182729 INFO os_vif [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e')
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.241 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <uuid>2c72a954-328b-49cc-a01f-afd216318d5a</uuid>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <name>instance-00000037</name>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411</nova:name>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:24:47</nova:creationTime>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:user uuid="60428d874c4b471889bd0e3c182d3b9a">tempest-FloatingIPsAssociationNegativeTestJSON-1086027111-project-member</nova:user>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:project uuid="6c3fca4180814795922e49898f54c932">tempest-FloatingIPsAssociationNegativeTestJSON-1086027111</nova:project>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         <nova:port uuid="7c16dfb4-1f91-44d6-9c64-9b08ba223050">
Jan 22 22:24:47 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <system>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="serial">2c72a954-328b-49cc-a01f-afd216318d5a</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="uuid">2c72a954-328b-49cc-a01f-afd216318d5a</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </system>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <os>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </os>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <features>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </features>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.config"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:b6:f6:68"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <target dev="tap7c16dfb4-1f"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/console.log" append="off"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <video>
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </video>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:24:47 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:24:47 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:24:47 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:24:47 compute-0 nova_compute[182725]: </domain>
Jan 22 22:24:47 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.242 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Preparing to wait for external event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.243 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.243 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.244 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.244 182729 DEBUG nova.virt.libvirt.vif [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-263288411',id=55,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c3fca4180814795922e49898f54c932',ramdisk_id='',reservation_id='r-6nf41k0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:42Z,user_data=None,user_id='60428d874c4b471889bd0e3c182d3b9a',uuid=2c72a954-328b-49cc-a01f-afd216318d5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.244 182729 DEBUG nova.network.os_vif_util [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converting VIF {"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.245 182729 DEBUG nova.network.os_vif_util [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.245 182729 DEBUG os_vif [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.246 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.246 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.246 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.249 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.250 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c16dfb4-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.250 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c16dfb4-1f, col_values=(('external_ids', {'iface-id': '7c16dfb4-1f91-44d6-9c64-9b08ba223050', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:f6:68', 'vm-uuid': '2c72a954-328b-49cc-a01f-afd216318d5a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.251 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.253 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 NetworkManager[54954]: <info>  [1769120687.2538] manager: (tap7c16dfb4-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.260 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.261 182729 INFO os_vif [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f')
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.293 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.294 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.294 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No VIF found with MAC fa:16:3e:8b:98:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.294 182729 INFO nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Using config drive
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.343 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.343 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.343 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] No VIF found with MAC fa:16:3e:b6:f6:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.344 182729 INFO nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Using config drive
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.797 182729 INFO nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Creating config drive at /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.config
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.804 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzu4kw36 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.842 182729 INFO nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Creating config drive at /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.config
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.853 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbpjjvczk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.943 182729 DEBUG oslo_concurrency.processutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzu4kw36" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:47 compute-0 nova_compute[182725]: 2026-01-22 22:24:47.989 182729 DEBUG oslo_concurrency.processutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbpjjvczk" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0021] manager: (tapca32495f-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 22 22:24:48 compute-0 kernel: tapca32495f-1e: entered promiscuous mode
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00147|binding|INFO|Claiming lport ca32495f-1eae-4a8d-98e8-46d372d9ac5d for this chassis.
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00148|binding|INFO|ca32495f-1eae-4a8d-98e8-46d372d9ac5d: Claiming fa:16:3e:8b:98:52 10.100.0.14
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.022 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:98:52 10.100.0.14'], port_security=['fa:16:3e:8b:98:52 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '646e57e9-5637-42ce-b4f6-c1c5603f1e0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=ca32495f-1eae-4a8d-98e8-46d372d9ac5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.024 104215 INFO neutron.agent.ovn.metadata.agent [-] Port ca32495f-1eae-4a8d-98e8-46d372d9ac5d in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad bound to our chassis
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.026 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.043 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7fb317-1e36-40f8-8282-aaeeb22b2762]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.044 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap976277ea-61 in ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:24:48 compute-0 systemd-udevd[217886]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.046 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap976277ea-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.046 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2641d777-dd50-4b1c-8a21-c5f8f596d982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.047 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0266e76b-7d4f-40bd-af2e-44b35f3fc934]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.061 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f50294b7-813c-40a6-8444-0a45a7b44e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0655] device (tapca32495f-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0671] device (tapca32495f-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 systemd-machined[154006]: New machine qemu-23-instance-00000036.
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00149|binding|INFO|Setting lport ca32495f-1eae-4a8d-98e8-46d372d9ac5d ovn-installed in OVS
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00150|binding|INFO|Setting lport ca32495f-1eae-4a8d-98e8-46d372d9ac5d up in Southbound
Jan 22 22:24:48 compute-0 kernel: tap7c16dfb4-1f: entered promiscuous mode
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0809] manager: (tap7c16dfb4-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.082 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00151|if_status|INFO|Not updating pb chassis for 7c16dfb4-1f91-44d6-9c64-9b08ba223050 now as sb is readonly
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00152|binding|INFO|Claiming lport 7c16dfb4-1f91-44d6-9c64-9b08ba223050 for this chassis.
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00153|binding|INFO|7c16dfb4-1f91-44d6-9c64-9b08ba223050: Claiming fa:16:3e:b6:f6:68 10.100.0.8
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.089 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000036.
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0925] device (tap7c16dfb4-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.091 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d01615b3-aa98-4419-9455-a71267380deb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.0935] device (tap7c16dfb4-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.108 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:f6:68 10.100.0.8'], port_security=['fa:16:3e:b6:f6:68 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2c72a954-328b-49cc-a01f-afd216318d5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-effa305d-5c76-462b-aabf-288608d21c44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3fca4180814795922e49898f54c932', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b02bd21-6139-4053-849a-5d4cf68f4b67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=096a816c-fd03-4c8c-abf2-09541493bfda, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=7c16dfb4-1f91-44d6-9c64-9b08ba223050) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.120 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a6573326-c02e-4695-b49a-441b86f885b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.139 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[362e1dc0-e832-4e13-8f72-cd793b327419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.1411] manager: (tap976277ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Jan 22 22:24:48 compute-0 systemd-machined[154006]: New machine qemu-24-instance-00000037.
Jan 22 22:24:48 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000037.
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00154|binding|INFO|Setting lport 7c16dfb4-1f91-44d6-9c64-9b08ba223050 ovn-installed in OVS
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00155|binding|INFO|Setting lport 7c16dfb4-1f91-44d6-9c64-9b08ba223050 up in Southbound
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.148 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.182 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[97d08bb0-cd10-4c38-abe0-a34558599c79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.185 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf76bf6-3c13-42ea-aa29-c27294f3fe69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.2111] device (tap976277ea-60): carrier: link connected
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.220 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f75d813a-e6c0-4fff-9e41-de46af15741f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.238 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bf789afb-dde4-4574-89fe-3f5e5f6f15dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432581, 'reachable_time': 30860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217938, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.253 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[29f0d8f4-7179-4c72-bcef-517ce07a6dae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:95d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432581, 'tstamp': 432581}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217939, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.271 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4319590c-0f44-44a1-b14d-005d8ed1439f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432581, 'reachable_time': 30860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217940, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.309 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cd526e3a-c57f-4f7e-a233-e5e1715f023c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.387 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e3840ca4-b85e-4343-b241-55217797265a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.388 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.389 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.389 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap976277ea-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:48 compute-0 kernel: tap976277ea-60: entered promiscuous mode
Jan 22 22:24:48 compute-0 NetworkManager[54954]: <info>  [1769120688.3948] manager: (tap976277ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.391 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.393 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.397 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap976277ea-60, col_values=(('external_ids', {'iface-id': '06db452d-91a0-4ebb-b584-a57953634a03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.399 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_controller[94850]: 2026-01-22T22:24:48Z|00156|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.400 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.402 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.403 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8b689c80-e337-4be3-a0b0-008d9362450a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.404 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:24:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:48.404 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'env', 'PROCESS_TAG=haproxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/976277ea-61b2-4223-a8f7-3d46bf9c98ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.411 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.422 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120688.421508, 2c72a954-328b-49cc-a01f-afd216318d5a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.422 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] VM Started (Lifecycle Event)
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.440 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.444 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120688.4217582, 2c72a954-328b-49cc-a01f-afd216318d5a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.445 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] VM Paused (Lifecycle Event)
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.519 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.524 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.624 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:48 compute-0 nova_compute[182725]: 2026-01-22 22:24:48.706 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:48 compute-0 podman[217978]: 2026-01-22 22:24:48.858603721 +0000 UTC m=+0.117862012 container create 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:24:48 compute-0 podman[217978]: 2026-01-22 22:24:48.764158258 +0000 UTC m=+0.023416549 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:24:48 compute-0 systemd[1]: Started libpod-conmon-83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d.scope.
Jan 22 22:24:48 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae4db02b95e06bba98d491fa064d101707120200313b734f224f5945f22a8d7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.220 182729 DEBUG nova.compute.manager [req-26af9032-fafd-4070-9e10-19b0007be9ae req-57581c36-1ade-4b4b-9cbc-c98422964379 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.222 182729 DEBUG oslo_concurrency.lockutils [req-26af9032-fafd-4070-9e10-19b0007be9ae req-57581c36-1ade-4b4b-9cbc-c98422964379 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.223 182729 DEBUG oslo_concurrency.lockutils [req-26af9032-fafd-4070-9e10-19b0007be9ae req-57581c36-1ade-4b4b-9cbc-c98422964379 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.223 182729 DEBUG oslo_concurrency.lockutils [req-26af9032-fafd-4070-9e10-19b0007be9ae req-57581c36-1ade-4b4b-9cbc-c98422964379 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.224 182729 DEBUG nova.compute.manager [req-26af9032-fafd-4070-9e10-19b0007be9ae req-57581c36-1ade-4b4b-9cbc-c98422964379 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Processing event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.234 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120689.2313216, 646e57e9-5637-42ce-b4f6-c1c5603f1e0e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.235 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] VM Started (Lifecycle Event)
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.239 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.249 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.255 182729 INFO nova.virt.libvirt.driver [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Instance spawned successfully.
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.256 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.265 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.271 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.283 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.283 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.284 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.285 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.286 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.288 182729 DEBUG nova.virt.libvirt.driver [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.295 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.296 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120689.2351494, 646e57e9-5637-42ce-b4f6-c1c5603f1e0e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.296 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] VM Paused (Lifecycle Event)
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.323 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.328 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120689.2477515, 646e57e9-5637-42ce-b4f6-c1c5603f1e0e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.328 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] VM Resumed (Lifecycle Event)
Jan 22 22:24:49 compute-0 podman[217978]: 2026-01-22 22:24:49.338165846 +0000 UTC m=+0.597424227 container init 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:24:49 compute-0 podman[217978]: 2026-01-22 22:24:49.350758007 +0000 UTC m=+0.610016328 container start 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.353 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.358 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.387 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:49 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [NOTICE]   (218015) : New worker (218017) forked
Jan 22 22:24:49 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [NOTICE]   (218015) : Loading success.
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.396 182729 INFO nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Took 7.18 seconds to spawn the instance on the hypervisor.
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.396 182729 DEBUG nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.505 182729 INFO nova.compute.manager [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Took 8.35 seconds to build instance.
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.506 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 7c16dfb4-1f91-44d6-9c64-9b08ba223050 in datapath effa305d-5c76-462b-aabf-288608d21c44 unbound from our chassis
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.511 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network effa305d-5c76-462b-aabf-288608d21c44
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.531 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e636a19b-d99a-4f3e-ac2b-d707e4a1ebe7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.533 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeffa305d-51 in ovnmeta-effa305d-5c76-462b-aabf-288608d21c44 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.538 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeffa305d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.538 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d4339957-ca06-44f0-b509-ebb84a4701dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.540 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a00998-c005-4ba4-a295-5d46946658d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.542 182729 DEBUG oslo_concurrency.lockutils [None req-4e7408c7-74a6-4174-ab4d-21c74a20cc1f 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.556 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca54f3e-2e25-4cb9-88a9-a903ccdffd13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 podman[217996]: 2026-01-22 22:24:49.563631455 +0000 UTC m=+0.492533936 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.575 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[09b420cb-ca5b-4a1a-8b40-b7bf1a14bf9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.583 182729 DEBUG nova.network.neutron [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updated VIF entry in instance network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.586 182729 DEBUG nova.network.neutron [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.604 182729 DEBUG oslo_concurrency.lockutils [req-d142a389-71fe-4e31-a85d-740936766926 req-e66eea67-25d5-4690-93e1-572e36764104 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.630 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[16ac13db-787a-4102-a50f-b6bc4c656ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.637 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[903ab97a-fc3d-48d3-93df-75151f3fe6e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 NetworkManager[54954]: <info>  [1769120689.6398] manager: (tapeffa305d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Jan 22 22:24:49 compute-0 systemd-udevd[217920]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.683 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[442900bd-b439-48d9-8ca3-9d0270bd0937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.690 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d7aa87-fd31-462d-96ab-f24c6526dc37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 NetworkManager[54954]: <info>  [1769120689.7206] device (tapeffa305d-50): carrier: link connected
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.729 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[36029114-6818-456f-8281-4e52019df52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.750 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b6a041-8a5c-4759-8a21-7320ef7ba287]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeffa305d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:c3:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432732, 'reachable_time': 26845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218050, 'error': None, 'target': 'ovnmeta-effa305d-5c76-462b-aabf-288608d21c44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.778 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[233eebf3-3130-4005-94ba-3959959310d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:c385'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432732, 'tstamp': 432732}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218051, 'error': None, 'target': 'ovnmeta-effa305d-5c76-462b-aabf-288608d21c44', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.803 182729 DEBUG nova.network.neutron [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Updated VIF entry in instance network info cache for port ca32495f-1eae-4a8d-98e8-46d372d9ac5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.803 182729 DEBUG nova.network.neutron [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Updating instance_info_cache with network_info: [{"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.807 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[86fb77f0-d5b0-4ea5-8a1f-ddaf2c164c87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeffa305d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:c3:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432732, 'reachable_time': 26845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218052, 'error': None, 'target': 'ovnmeta-effa305d-5c76-462b-aabf-288608d21c44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.823 182729 DEBUG oslo_concurrency.lockutils [req-8d12396f-6068-4fc3-a97b-4dad4630242b req-1a8876fb-b105-4062-b666-101da90df460 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-646e57e9-5637-42ce-b4f6-c1c5603f1e0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.866 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e6635577-ed3c-4aa8-a0f9-5c598660f5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.956 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e600b3b4-9f2e-4c53-a039-7f1cf20fcf08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.959 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeffa305d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.959 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.960 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeffa305d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.962 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:49 compute-0 NetworkManager[54954]: <info>  [1769120689.9637] manager: (tapeffa305d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 22 22:24:49 compute-0 kernel: tapeffa305d-50: entered promiscuous mode
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.967 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.968 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeffa305d-50, col_values=(('external_ids', {'iface-id': '6d5c8dc7-0a4c-4cae-99df-8a301cedfb8f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:49 compute-0 ovn_controller[94850]: 2026-01-22T22:24:49Z|00157|binding|INFO|Releasing lport 6d5c8dc7-0a4c-4cae-99df-8a301cedfb8f from this chassis (sb_readonly=0)
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.970 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.971 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/effa305d-5c76-462b-aabf-288608d21c44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/effa305d-5c76-462b-aabf-288608d21c44.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.980 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[710ab68f-b297-4e1c-aee3-2b82c2fbc81d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:49 compute-0 nova_compute[182725]: 2026-01-22 22:24:49.981 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.982 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-effa305d-5c76-462b-aabf-288608d21c44
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/effa305d-5c76-462b-aabf-288608d21c44.pid.haproxy
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID effa305d-5c76-462b-aabf-288608d21c44
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:24:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:49.984 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-effa305d-5c76-462b-aabf-288608d21c44', 'env', 'PROCESS_TAG=haproxy-effa305d-5c76-462b-aabf-288608d21c44', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/effa305d-5c76-462b-aabf-288608d21c44.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:24:50 compute-0 podman[218085]: 2026-01-22 22:24:50.395968452 +0000 UTC m=+0.044130641 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:24:50 compute-0 podman[218085]: 2026-01-22 22:24:50.715241618 +0000 UTC m=+0.363403817 container create 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:24:51 compute-0 systemd[1]: Started libpod-conmon-76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47.scope.
Jan 22 22:24:51 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:24:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b2d1d824682a12c47db1c5660e684924b8ef494d4ceea6384dac9b1ea7b9bae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.356 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.357 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.357 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.357 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.357 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] No waiting events found dispatching network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.357 182729 WARNING nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received unexpected event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d for instance with vm_state active and task_state None.
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.358 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.358 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.358 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.358 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.358 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Processing event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 DEBUG oslo_concurrency.lockutils [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 DEBUG nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] No waiting events found dispatching network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.359 182729 WARNING nova.compute.manager [req-aef07626-d785-40e6-9b8a-9748dec33eed req-29e48d41-70f3-4c76-98bc-51ee29fa14fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received unexpected event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 for instance with vm_state building and task_state spawning.
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.360 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.366 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120691.3661473, 2c72a954-328b-49cc-a01f-afd216318d5a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.366 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] VM Resumed (Lifecycle Event)
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.370 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.374 182729 INFO nova.virt.libvirt.driver [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Instance spawned successfully.
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.374 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.406 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.409 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.409 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.409 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.410 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.410 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.410 182729 DEBUG nova.virt.libvirt.driver [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.414 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.445 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:24:51 compute-0 podman[218085]: 2026-01-22 22:24:51.624541957 +0000 UTC m=+1.272704136 container init 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:24:51 compute-0 podman[218085]: 2026-01-22 22:24:51.635826226 +0000 UTC m=+1.283988385 container start 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 22:24:51 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [NOTICE]   (218103) : New worker (218105) forked
Jan 22 22:24:51 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [NOTICE]   (218103) : Loading success.
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.811 182729 INFO nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Took 9.39 seconds to spawn the instance on the hypervisor.
Jan 22 22:24:51 compute-0 nova_compute[182725]: 2026-01-22 22:24:51.811 182729 DEBUG nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.025 182729 INFO nova.compute.manager [None req-f2fc82b0-3050-4ec5-96be-88924d427e3a 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Pausing
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.026 182729 DEBUG nova.objects.instance [None req-f2fc82b0-3050-4ec5-96be-88924d427e3a 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'flavor' on Instance uuid 646e57e9-5637-42ce-b4f6-c1c5603f1e0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.033 182729 INFO nova.compute.manager [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Took 10.55 seconds to build instance.
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.089 182729 DEBUG oslo_concurrency.lockutils [None req-da7c6806-49b2-48a5-9f27-6b6f35c0bc07 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.182 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120692.182582, 646e57e9-5637-42ce-b4f6-c1c5603f1e0e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.183 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] VM Paused (Lifecycle Event)
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.186 182729 DEBUG nova.compute.manager [None req-f2fc82b0-3050-4ec5-96be-88924d427e3a 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.213 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.217 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.246 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 22 22:24:52 compute-0 nova_compute[182725]: 2026-01-22 22:24:52.253 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:52.350 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:53 compute-0 nova_compute[182725]: 2026-01-22 22:24:53.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.456 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.457 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.457 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.457 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.458 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.475 182729 INFO nova.compute.manager [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Terminating instance
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.489 182729 DEBUG nova.compute.manager [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:24:55 compute-0 kernel: tapca32495f-1e (unregistering): left promiscuous mode
Jan 22 22:24:55 compute-0 NetworkManager[54954]: <info>  [1769120695.5169] device (tapca32495f-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:24:55 compute-0 ovn_controller[94850]: 2026-01-22T22:24:55Z|00158|binding|INFO|Releasing lport ca32495f-1eae-4a8d-98e8-46d372d9ac5d from this chassis (sb_readonly=0)
Jan 22 22:24:55 compute-0 ovn_controller[94850]: 2026-01-22T22:24:55Z|00159|binding|INFO|Setting lport ca32495f-1eae-4a8d-98e8-46d372d9ac5d down in Southbound
Jan 22 22:24:55 compute-0 ovn_controller[94850]: 2026-01-22T22:24:55Z|00160|binding|INFO|Removing iface tapca32495f-1e ovn-installed in OVS
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.542 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:55.549 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:98:52 10.100.0.14'], port_security=['fa:16:3e:8b:98:52 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '646e57e9-5637-42ce-b4f6-c1c5603f1e0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=ca32495f-1eae-4a8d-98e8-46d372d9ac5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:24:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:55.551 104215 INFO neutron.agent.ovn.metadata.agent [-] Port ca32495f-1eae-4a8d-98e8-46d372d9ac5d in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad unbound from our chassis
Jan 22 22:24:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:55.554 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:24:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:55.555 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[59c521b4-69a8-48c4-a3dc-95b034c25f3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:55.556 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace which is not needed anymore
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.564 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000036.scope: Deactivated successfully.
Jan 22 22:24:55 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000036.scope: Consumed 4.011s CPU time.
Jan 22 22:24:55 compute-0 systemd-machined[154006]: Machine qemu-23-instance-00000036 terminated.
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.781 182729 INFO nova.virt.libvirt.driver [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Instance destroyed successfully.
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.782 182729 DEBUG nova.objects.instance [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'resources' on Instance uuid 646e57e9-5637-42ce-b4f6-c1c5603f1e0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.799 182729 DEBUG nova.virt.libvirt.vif [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-931857640',display_name='tempest-DeleteServersTestJSON-server-931857640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-931857640',id=54,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-4w99ybxd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:52Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=646e57e9-5637-42ce-b4f6-c1c5603f1e0e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.800 182729 DEBUG nova.network.os_vif_util [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "address": "fa:16:3e:8b:98:52", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca32495f-1e", "ovs_interfaceid": "ca32495f-1eae-4a8d-98e8-46d372d9ac5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.801 182729 DEBUG nova.network.os_vif_util [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.801 182729 DEBUG os_vif [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.803 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.804 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca32495f-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.805 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.811 182729 INFO os_vif [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:98:52,bridge_name='br-int',has_traffic_filtering=True,id=ca32495f-1eae-4a8d-98e8-46d372d9ac5d,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca32495f-1e')
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.812 182729 INFO nova.virt.libvirt.driver [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Deleting instance files /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e_del
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.813 182729 INFO nova.virt.libvirt.driver [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Deletion of /var/lib/nova/instances/646e57e9-5637-42ce-b4f6-c1c5603f1e0e_del complete
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.904 182729 INFO nova.compute.manager [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.905 182729 DEBUG oslo.service.loopingcall [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.905 182729 DEBUG nova.compute.manager [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:24:55 compute-0 nova_compute[182725]: 2026-01-22 22:24:55.905 182729 DEBUG nova.network.neutron [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [NOTICE]   (218015) : haproxy version is 2.8.14-c23fe91
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [NOTICE]   (218015) : path to executable is /usr/sbin/haproxy
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [WARNING]  (218015) : Exiting Master process...
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [WARNING]  (218015) : Exiting Master process...
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [ALERT]    (218015) : Current worker (218017) exited with code 143 (Terminated)
Jan 22 22:24:55 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[217993]: [WARNING]  (218015) : All workers exited. Exiting... (0)
Jan 22 22:24:55 compute-0 systemd[1]: libpod-83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d.scope: Deactivated successfully.
Jan 22 22:24:55 compute-0 podman[218137]: 2026-01-22 22:24:55.924233396 +0000 UTC m=+0.248126809 container died 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:24:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d-userdata-shm.mount: Deactivated successfully.
Jan 22 22:24:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-ae4db02b95e06bba98d491fa064d101707120200313b734f224f5945f22a8d7a-merged.mount: Deactivated successfully.
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.814 182729 DEBUG nova.compute.manager [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-unplugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.815 182729 DEBUG oslo_concurrency.lockutils [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.816 182729 DEBUG oslo_concurrency.lockutils [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.816 182729 DEBUG oslo_concurrency.lockutils [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.817 182729 DEBUG nova.compute.manager [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] No waiting events found dispatching network-vif-unplugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:24:56 compute-0 nova_compute[182725]: 2026-01-22 22:24:56.817 182729 DEBUG nova.compute.manager [req-a5fa7817-a486-4f38-834b-5cd00c7e3dcd req-048a9ae7-68fb-4908-8e7f-63e3462b7675 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-unplugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.350 182729 DEBUG nova.network.neutron [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.367 182729 INFO nova.compute.manager [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Took 1.46 seconds to deallocate network for instance.
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.492 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.493 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.591 182729 DEBUG nova.compute.provider_tree [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:24:57 compute-0 podman[218137]: 2026-01-22 22:24:57.647990681 +0000 UTC m=+1.971884064 container cleanup 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 22:24:57 compute-0 systemd[1]: libpod-conmon-83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d.scope: Deactivated successfully.
Jan 22 22:24:57 compute-0 podman[218182]: 2026-01-22 22:24:57.781038947 +0000 UTC m=+0.088546428 container remove 83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.790 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[61013212-3fc7-4eda-90ab-652979b06a1d]: (4, ('Thu Jan 22 10:24:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d)\n83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d\nThu Jan 22 10:24:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d)\n83bae6692b781b142d243dc3af1b165f5aee05d4c490931f5bb30ae3d7088a7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.793 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[797c2372-3061-4336-a05e-cb075b4dcdf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.795 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.797 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:57 compute-0 kernel: tap976277ea-60: left promiscuous mode
Jan 22 22:24:57 compute-0 nova_compute[182725]: 2026-01-22 22:24:57.824 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.830 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f96690d-cac3-4103-a7e7-837462022e58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.846 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3ccbad-f3d3-471b-bf34-63429b511745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.848 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f7e945-49e3-42e8-8a4e-ac486b5d9e64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.876 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[965ad243-8fec-4ac0-b29d-79c66308b25a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432571, 'reachable_time': 34117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218198, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.879 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:24:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:24:57.879 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[33c433a6-e65d-4199-91be-896a0b4f4604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:24:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d976277ea\x2d61b2\x2d4223\x2da8f7\x2d3d46bf9c98ad.mount: Deactivated successfully.
Jan 22 22:24:58 compute-0 nova_compute[182725]: 2026-01-22 22:24:58.715 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.075 182729 DEBUG nova.scheduler.client.report [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.122 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.193 182729 INFO nova.scheduler.client.report [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Deleted allocations for instance 646e57e9-5637-42ce-b4f6-c1c5603f1e0e
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.268 182729 DEBUG nova.compute.manager [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.269 182729 DEBUG oslo_concurrency.lockutils [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.270 182729 DEBUG oslo_concurrency.lockutils [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.270 182729 DEBUG oslo_concurrency.lockutils [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.271 182729 DEBUG nova.compute.manager [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] No waiting events found dispatching network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.271 182729 WARNING nova.compute.manager [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received unexpected event network-vif-plugged-ca32495f-1eae-4a8d-98e8-46d372d9ac5d for instance with vm_state deleted and task_state None.
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.271 182729 DEBUG nova.compute.manager [req-66541abd-3a83-43cd-a847-c9bea8cd7af0 req-0771a19c-d439-4fde-aac5-c65d882b4723 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Received event network-vif-deleted-ca32495f-1eae-4a8d-98e8-46d372d9ac5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.343 182729 DEBUG oslo_concurrency.lockutils [None req-d87e664f-155a-431a-aa1b-e0f81eae77c3 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "646e57e9-5637-42ce-b4f6-c1c5603f1e0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:00 compute-0 nova_compute[182725]: 2026-01-22 22:25:00.806 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:03 compute-0 ovn_controller[94850]: 2026-01-22T22:25:03Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b6:f6:68 10.100.0.8
Jan 22 22:25:03 compute-0 ovn_controller[94850]: 2026-01-22T22:25:03Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:f6:68 10.100.0.8
Jan 22 22:25:03 compute-0 nova_compute[182725]: 2026-01-22 22:25:03.717 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:04 compute-0 podman[218222]: 2026-01-22 22:25:04.156932066 +0000 UTC m=+0.076421958 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 22:25:04 compute-0 nova_compute[182725]: 2026-01-22 22:25:04.317 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:04 compute-0 NetworkManager[54954]: <info>  [1769120704.3181] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 22 22:25:04 compute-0 NetworkManager[54954]: <info>  [1769120704.3192] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 22 22:25:04 compute-0 nova_compute[182725]: 2026-01-22 22:25:04.405 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:04 compute-0 ovn_controller[94850]: 2026-01-22T22:25:04Z|00161|binding|INFO|Releasing lport 6d5c8dc7-0a4c-4cae-99df-8a301cedfb8f from this chassis (sb_readonly=0)
Jan 22 22:25:04 compute-0 nova_compute[182725]: 2026-01-22 22:25:04.422 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:05 compute-0 nova_compute[182725]: 2026-01-22 22:25:05.809 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:06 compute-0 podman[218244]: 2026-01-22 22:25:06.147125363 +0000 UTC m=+0.073487246 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Jan 22 22:25:06 compute-0 podman[218243]: 2026-01-22 22:25:06.199726183 +0000 UTC m=+0.125018689 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:25:07 compute-0 nova_compute[182725]: 2026-01-22 22:25:07.138 182729 DEBUG nova.compute.manager [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:07 compute-0 nova_compute[182725]: 2026-01-22 22:25:07.139 182729 DEBUG nova.compute.manager [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing instance network info cache due to event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:25:07 compute-0 nova_compute[182725]: 2026-01-22 22:25:07.139 182729 DEBUG oslo_concurrency.lockutils [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:25:07 compute-0 nova_compute[182725]: 2026-01-22 22:25:07.140 182729 DEBUG oslo_concurrency.lockutils [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:25:07 compute-0 nova_compute[182725]: 2026-01-22 22:25:07.141 182729 DEBUG nova.network.neutron [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:25:08 compute-0 nova_compute[182725]: 2026-01-22 22:25:08.719 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.747 182729 DEBUG nova.network.neutron [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updated VIF entry in instance network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.748 182729 DEBUG nova.network.neutron [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.776 182729 DEBUG oslo_concurrency.lockutils [req-91184723-4f26-4b45-8f3a-6f84ca1b7b33 req-8cdf7bb9-3456-475f-8e1d-9ab61063aeae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.779 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120695.7779167, 646e57e9-5637-42ce-b4f6-c1c5603f1e0e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.780 182729 INFO nova.compute.manager [-] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] VM Stopped (Lifecycle Event)
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.814 182729 DEBUG nova.compute.manager [None req-f157cb2a-88db-4cf6-bc74-cab754fff77c - - - - - -] [instance: 646e57e9-5637-42ce-b4f6-c1c5603f1e0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:25:10 compute-0 nova_compute[182725]: 2026-01-22 22:25:10.815 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:11 compute-0 nova_compute[182725]: 2026-01-22 22:25:11.913 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:11 compute-0 nova_compute[182725]: 2026-01-22 22:25:11.914 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:12.433 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:12.434 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:12.435 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.366 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.367 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.367 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.368 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.607 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.709 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.711 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.746 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:13 compute-0 nova_compute[182725]: 2026-01-22 22:25:13.824 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.067 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.068 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5464MB free_disk=73.34786224365234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.068 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.069 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.967 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 2c72a954-328b-49cc-a01f-afd216318d5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.967 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:25:14 compute-0 nova_compute[182725]: 2026-01-22 22:25:14.967 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.035 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.055 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.102 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.102 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:15 compute-0 nova_compute[182725]: 2026-01-22 22:25:15.818 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:16 compute-0 nova_compute[182725]: 2026-01-22 22:25:16.077 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:16 compute-0 nova_compute[182725]: 2026-01-22 22:25:16.078 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:25:16 compute-0 nova_compute[182725]: 2026-01-22 22:25:16.079 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:25:17 compute-0 podman[218299]: 2026-01-22 22:25:17.143635946 +0000 UTC m=+0.066335408 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:25:17 compute-0 podman[218300]: 2026-01-22 22:25:17.145292037 +0000 UTC m=+0.064624764 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:25:17 compute-0 nova_compute[182725]: 2026-01-22 22:25:17.667 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:25:17 compute-0 nova_compute[182725]: 2026-01-22 22:25:17.667 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:25:17 compute-0 nova_compute[182725]: 2026-01-22 22:25:17.668 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:25:17 compute-0 nova_compute[182725]: 2026-01-22 22:25:17.668 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2c72a954-328b-49cc-a01f-afd216318d5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:25:18 compute-0 nova_compute[182725]: 2026-01-22 22:25:18.725 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:20 compute-0 podman[218343]: 2026-01-22 22:25:20.162383843 +0000 UTC m=+0.084829799 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:25:20 compute-0 nova_compute[182725]: 2026-01-22 22:25:20.823 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.214 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.240 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.240 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.241 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.242 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.242 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.242 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.242 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.243 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:21 compute-0 nova_compute[182725]: 2026-01-22 22:25:21.243 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:25:23 compute-0 nova_compute[182725]: 2026-01-22 22:25:23.729 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:24 compute-0 nova_compute[182725]: 2026-01-22 22:25:24.049 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:25:25 compute-0 nova_compute[182725]: 2026-01-22 22:25:25.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:26 compute-0 ovn_controller[94850]: 2026-01-22T22:25:26Z|00162|binding|INFO|Releasing lport 6d5c8dc7-0a4c-4cae-99df-8a301cedfb8f from this chassis (sb_readonly=0)
Jan 22 22:25:26 compute-0 nova_compute[182725]: 2026-01-22 22:25:26.204 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:27 compute-0 nova_compute[182725]: 2026-01-22 22:25:27.608 182729 DEBUG nova.compute.manager [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:27 compute-0 nova_compute[182725]: 2026-01-22 22:25:27.608 182729 DEBUG nova.compute.manager [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing instance network info cache due to event network-changed-7c16dfb4-1f91-44d6-9c64-9b08ba223050. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:25:27 compute-0 nova_compute[182725]: 2026-01-22 22:25:27.609 182729 DEBUG oslo_concurrency.lockutils [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:25:27 compute-0 nova_compute[182725]: 2026-01-22 22:25:27.609 182729 DEBUG oslo_concurrency.lockutils [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:25:27 compute-0 nova_compute[182725]: 2026-01-22 22:25:27.610 182729 DEBUG nova.network.neutron [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Refreshing network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:25:28 compute-0 nova_compute[182725]: 2026-01-22 22:25:28.731 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:30 compute-0 nova_compute[182725]: 2026-01-22 22:25:30.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:31 compute-0 nova_compute[182725]: 2026-01-22 22:25:31.109 182729 DEBUG nova.network.neutron [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updated VIF entry in instance network info cache for port 7c16dfb4-1f91-44d6-9c64-9b08ba223050. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:25:31 compute-0 nova_compute[182725]: 2026-01-22 22:25:31.110 182729 DEBUG nova.network.neutron [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [{"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:25:31 compute-0 nova_compute[182725]: 2026-01-22 22:25:31.172 182729 DEBUG oslo_concurrency.lockutils [req-8745fae0-e9a8-4a3c-9f2c-c7d3c14fe501 req-063d6c78-b55e-48a1-9be4-83fc45ba05db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2c72a954-328b-49cc-a01f-afd216318d5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:25:33 compute-0 nova_compute[182725]: 2026-01-22 22:25:33.733 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:35 compute-0 podman[218368]: 2026-01-22 22:25:35.131992889 +0000 UTC m=+0.062936323 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:25:35 compute-0 nova_compute[182725]: 2026-01-22 22:25:35.831 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:37 compute-0 podman[218389]: 2026-01-22 22:25:37.133219651 +0000 UTC m=+0.062884041 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Jan 22 22:25:37 compute-0 podman[218388]: 2026-01-22 22:25:37.147944599 +0000 UTC m=+0.085088956 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.928 182729 INFO nova.compute.manager [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Terminating instance
Jan 22 22:25:38 compute-0 nova_compute[182725]: 2026-01-22 22:25:38.938 182729 DEBUG nova.compute.manager [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:25:39 compute-0 kernel: tap7c16dfb4-1f (unregistering): left promiscuous mode
Jan 22 22:25:39 compute-0 NetworkManager[54954]: <info>  [1769120739.0565] device (tap7c16dfb4-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.064 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 ovn_controller[94850]: 2026-01-22T22:25:39Z|00163|binding|INFO|Releasing lport 7c16dfb4-1f91-44d6-9c64-9b08ba223050 from this chassis (sb_readonly=0)
Jan 22 22:25:39 compute-0 ovn_controller[94850]: 2026-01-22T22:25:39Z|00164|binding|INFO|Setting lport 7c16dfb4-1f91-44d6-9c64-9b08ba223050 down in Southbound
Jan 22 22:25:39 compute-0 ovn_controller[94850]: 2026-01-22T22:25:39Z|00165|binding|INFO|Removing iface tap7c16dfb4-1f ovn-installed in OVS
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.065 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:39.074 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:f6:68 10.100.0.8'], port_security=['fa:16:3e:b6:f6:68 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2c72a954-328b-49cc-a01f-afd216318d5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-effa305d-5c76-462b-aabf-288608d21c44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3fca4180814795922e49898f54c932', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b02bd21-6139-4053-849a-5d4cf68f4b67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=096a816c-fd03-4c8c-abf2-09541493bfda, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=7c16dfb4-1f91-44d6-9c64-9b08ba223050) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:25:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:39.076 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 7c16dfb4-1f91-44d6-9c64-9b08ba223050 in datapath effa305d-5c76-462b-aabf-288608d21c44 unbound from our chassis
Jan 22 22:25:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:39.077 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network effa305d-5c76-462b-aabf-288608d21c44, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:25:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:39.079 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[463b586c-5b6f-4126-86dc-883c45d2e4b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:39.080 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-effa305d-5c76-462b-aabf-288608d21c44 namespace which is not needed anymore
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 22 22:25:39 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Consumed 14.767s CPU time.
Jan 22 22:25:39 compute-0 systemd-machined[154006]: Machine qemu-24-instance-00000037 terminated.
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.167 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.202 182729 INFO nova.virt.libvirt.driver [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Instance destroyed successfully.
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.204 182729 DEBUG nova.objects.instance [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lazy-loading 'resources' on Instance uuid 2c72a954-328b-49cc-a01f-afd216318d5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.226 182729 DEBUG nova.virt.libvirt.vif [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-263288411',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-263288411',id=55,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6c3fca4180814795922e49898f54c932',ramdisk_id='',reservation_id='r-6nf41k0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1086027111-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:51Z,user_data=None,user_id='60428d874c4b471889bd0e3c182d3b9a',uuid=2c72a954-328b-49cc-a01f-afd216318d5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.227 182729 DEBUG nova.network.os_vif_util [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converting VIF {"id": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "address": "fa:16:3e:b6:f6:68", "network": {"id": "effa305d-5c76-462b-aabf-288608d21c44", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1683602142-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6c3fca4180814795922e49898f54c932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c16dfb4-1f", "ovs_interfaceid": "7c16dfb4-1f91-44d6-9c64-9b08ba223050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.228 182729 DEBUG nova.network.os_vif_util [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.229 182729 DEBUG os_vif [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.231 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.231 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c16dfb4-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.235 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.238 182729 INFO os_vif [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:f6:68,bridge_name='br-int',has_traffic_filtering=True,id=7c16dfb4-1f91-44d6-9c64-9b08ba223050,network=Network(effa305d-5c76-462b-aabf-288608d21c44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c16dfb4-1f')
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.239 182729 INFO nova.virt.libvirt.driver [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Deleting instance files /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a_del
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.240 182729 INFO nova.virt.libvirt.driver [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Deletion of /var/lib/nova/instances/2c72a954-328b-49cc-a01f-afd216318d5a_del complete
Jan 22 22:25:39 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [NOTICE]   (218103) : haproxy version is 2.8.14-c23fe91
Jan 22 22:25:39 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [NOTICE]   (218103) : path to executable is /usr/sbin/haproxy
Jan 22 22:25:39 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [WARNING]  (218103) : Exiting Master process...
Jan 22 22:25:39 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [ALERT]    (218103) : Current worker (218105) exited with code 143 (Terminated)
Jan 22 22:25:39 compute-0 neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44[218099]: [WARNING]  (218103) : All workers exited. Exiting... (0)
Jan 22 22:25:39 compute-0 systemd[1]: libpod-76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47.scope: Deactivated successfully.
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.347 182729 INFO nova.compute.manager [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.348 182729 DEBUG oslo.service.loopingcall [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.348 182729 DEBUG nova.compute.manager [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.348 182729 DEBUG nova.network.neutron [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:25:39 compute-0 podman[218460]: 2026-01-22 22:25:39.350743966 +0000 UTC m=+0.178206970 container died 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.498 182729 DEBUG nova.compute.manager [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-unplugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.498 182729 DEBUG oslo_concurrency.lockutils [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.499 182729 DEBUG oslo_concurrency.lockutils [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.499 182729 DEBUG oslo_concurrency.lockutils [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.500 182729 DEBUG nova.compute.manager [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] No waiting events found dispatching network-vif-unplugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:25:39 compute-0 nova_compute[182725]: 2026-01-22 22:25:39.500 182729 DEBUG nova.compute.manager [req-cea164c3-40c7-4ce0-9636-1b3e68ad036a req-4e43bad6-dda8-4df8-bdfc-0f127e55ea45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-unplugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:25:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47-userdata-shm.mount: Deactivated successfully.
Jan 22 22:25:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-7b2d1d824682a12c47db1c5660e684924b8ef494d4ceea6384dac9b1ea7b9bae-merged.mount: Deactivated successfully.
Jan 22 22:25:39 compute-0 podman[218460]: 2026-01-22 22:25:39.927496615 +0000 UTC m=+0.754959619 container cleanup 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:25:40 compute-0 podman[218502]: 2026-01-22 22:25:40.182823288 +0000 UTC m=+0.228325241 container remove 76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.188 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[df930c66-11c2-4fcb-ae9f-e265f66dbd9b]: (4, ('Thu Jan 22 10:25:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44 (76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47)\n76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47\nThu Jan 22 10:25:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-effa305d-5c76-462b-aabf-288608d21c44 (76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47)\n76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.190 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5474589d-9e6f-48a5-96c8-cb4904bb93fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.191 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeffa305d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:25:40 compute-0 nova_compute[182725]: 2026-01-22 22:25:40.194 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:40 compute-0 kernel: tapeffa305d-50: left promiscuous mode
Jan 22 22:25:40 compute-0 systemd[1]: libpod-conmon-76de37f304b7853280b3ff6788dda084200faacced5a46051c64c0c14378db47.scope: Deactivated successfully.
Jan 22 22:25:40 compute-0 nova_compute[182725]: 2026-01-22 22:25:40.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.208 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e09a6e7d-7faf-4996-82eb-5d50d0bc92a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.223 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa90499-a68f-4487-bc0b-99eb687031e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.224 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[06ef9d36-610c-477b-aebd-741f0ebc762e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.243 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[07a78180-403d-4f66-a2bb-a172412d16ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432722, 'reachable_time': 38607, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218517, 'error': None, 'target': 'ovnmeta-effa305d-5c76-462b-aabf-288608d21c44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.246 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-effa305d-5c76-462b-aabf-288608d21c44 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:25:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:40.246 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[2dff0d98-95aa-4d6e-83f6-f48e14d3c489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:25:40 compute-0 systemd[1]: run-netns-ovnmeta\x2deffa305d\x2d5c76\x2d462b\x2daabf\x2d288608d21c44.mount: Deactivated successfully.
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.537 182729 DEBUG nova.network.neutron [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.561 182729 INFO nova.compute.manager [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Took 2.21 seconds to deallocate network for instance.
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.644 182729 DEBUG nova.compute.manager [req-117a8508-1d81-43b8-b49f-0b096a985c23 req-5f549a43-65b5-4a81-aa38-ed9d56e39768 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-deleted-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.691 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.691 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.729 182729 DEBUG nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.739 182729 DEBUG nova.compute.manager [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.739 182729 DEBUG oslo_concurrency.lockutils [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.740 182729 DEBUG oslo_concurrency.lockutils [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.740 182729 DEBUG oslo_concurrency.lockutils [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.740 182729 DEBUG nova.compute.manager [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] No waiting events found dispatching network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.740 182729 WARNING nova.compute.manager [req-627a1d94-5eae-4c31-a6a5-a33865902303 req-a4d777f2-4ba5-4a84-b370-2e1f45e33ec6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Received unexpected event network-vif-plugged-7c16dfb4-1f91-44d6-9c64-9b08ba223050 for instance with vm_state deleted and task_state None.
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.753 182729 DEBUG nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.753 182729 DEBUG nova.compute.provider_tree [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.770 182729 DEBUG nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.799 182729 DEBUG nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.894 182729 DEBUG nova.compute.provider_tree [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.910 182729 DEBUG nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.941 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:41 compute-0 nova_compute[182725]: 2026-01-22 22:25:41.987 182729 INFO nova.scheduler.client.report [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Deleted allocations for instance 2c72a954-328b-49cc-a01f-afd216318d5a
Jan 22 22:25:42 compute-0 nova_compute[182725]: 2026-01-22 22:25:42.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:42 compute-0 nova_compute[182725]: 2026-01-22 22:25:42.130 182729 DEBUG oslo_concurrency.lockutils [None req-9e68d689-bca0-4e99-9859-c77cf3689a71 60428d874c4b471889bd0e3c182d3b9a 6c3fca4180814795922e49898f54c932 - - default default] Lock "2c72a954-328b-49cc-a01f-afd216318d5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:25:43 compute-0 nova_compute[182725]: 2026-01-22 22:25:43.738 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:44 compute-0 nova_compute[182725]: 2026-01-22 22:25:44.233 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:44.729 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:25:44 compute-0 nova_compute[182725]: 2026-01-22 22:25:44.730 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:44.731 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:25:44 compute-0 nova_compute[182725]: 2026-01-22 22:25:44.955 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:45 compute-0 nova_compute[182725]: 2026-01-22 22:25:45.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:48 compute-0 podman[218519]: 2026-01-22 22:25:48.139708304 +0000 UTC m=+0.071517197 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 22:25:48 compute-0 podman[218520]: 2026-01-22 22:25:48.150169865 +0000 UTC m=+0.068760748 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:25:48 compute-0 nova_compute[182725]: 2026-01-22 22:25:48.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:49 compute-0 nova_compute[182725]: 2026-01-22 22:25:49.235 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:51 compute-0 podman[218564]: 2026-01-22 22:25:51.148662406 +0000 UTC m=+0.077738862 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:25:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:25:52.733 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:25:53 compute-0 nova_compute[182725]: 2026-01-22 22:25:53.741 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:54 compute-0 nova_compute[182725]: 2026-01-22 22:25:54.201 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120739.1995807, 2c72a954-328b-49cc-a01f-afd216318d5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:25:54 compute-0 nova_compute[182725]: 2026-01-22 22:25:54.201 182729 INFO nova.compute.manager [-] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] VM Stopped (Lifecycle Event)
Jan 22 22:25:54 compute-0 nova_compute[182725]: 2026-01-22 22:25:54.234 182729 DEBUG nova.compute.manager [None req-658a842e-9803-4a1e-b82d-b4665237ba28 - - - - - -] [instance: 2c72a954-328b-49cc-a01f-afd216318d5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:25:54 compute-0 nova_compute[182725]: 2026-01-22 22:25:54.237 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:58 compute-0 nova_compute[182725]: 2026-01-22 22:25:58.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:25:59 compute-0 nova_compute[182725]: 2026-01-22 22:25:59.239 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:03 compute-0 nova_compute[182725]: 2026-01-22 22:26:03.745 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:04 compute-0 nova_compute[182725]: 2026-01-22 22:26:04.242 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:06 compute-0 podman[218586]: 2026-01-22 22:26:06.148576808 +0000 UTC m=+0.077274980 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 22:26:08 compute-0 podman[218607]: 2026-01-22 22:26:08.130000198 +0000 UTC m=+0.063044455 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public)
Jan 22 22:26:08 compute-0 podman[218606]: 2026-01-22 22:26:08.202396775 +0000 UTC m=+0.126671783 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:26:08 compute-0 nova_compute[182725]: 2026-01-22 22:26:08.748 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:26:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:26:09 compute-0 nova_compute[182725]: 2026-01-22 22:26:09.245 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.921 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.921 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.921 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:11 compute-0 nova_compute[182725]: 2026-01-22 22:26:11.922 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.105 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.107 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.37693405151367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.107 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.107 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.167 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.167 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.195 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.217 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.248 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:26:12 compute-0 nova_compute[182725]: 2026-01-22 22:26:12.249 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:12.433 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:12.434 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:12.434 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:13 compute-0 nova_compute[182725]: 2026-01-22 22:26:13.249 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:13 compute-0 nova_compute[182725]: 2026-01-22 22:26:13.250 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:26:13 compute-0 nova_compute[182725]: 2026-01-22 22:26:13.250 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:26:13 compute-0 nova_compute[182725]: 2026-01-22 22:26:13.271 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:26:13 compute-0 nova_compute[182725]: 2026-01-22 22:26:13.750 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:14 compute-0 nova_compute[182725]: 2026-01-22 22:26:14.248 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:14 compute-0 nova_compute[182725]: 2026-01-22 22:26:14.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:14 compute-0 nova_compute[182725]: 2026-01-22 22:26:14.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:26:15 compute-0 nova_compute[182725]: 2026-01-22 22:26:15.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:15 compute-0 nova_compute[182725]: 2026-01-22 22:26:15.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:18 compute-0 nova_compute[182725]: 2026-01-22 22:26:18.752 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:18 compute-0 podman[218654]: 2026-01-22 22:26:18.847606848 +0000 UTC m=+0.059954328 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:26:18 compute-0 podman[218655]: 2026-01-22 22:26:18.855556596 +0000 UTC m=+0.062119392 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:26:18 compute-0 nova_compute[182725]: 2026-01-22 22:26:18.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:18 compute-0 nova_compute[182725]: 2026-01-22 22:26:18.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:19 compute-0 nova_compute[182725]: 2026-01-22 22:26:19.251 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:19 compute-0 nova_compute[182725]: 2026-01-22 22:26:19.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:26:22 compute-0 podman[218696]: 2026-01-22 22:26:22.163718078 +0000 UTC m=+0.069472376 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:26:23 compute-0 nova_compute[182725]: 2026-01-22 22:26:23.761 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:24 compute-0 nova_compute[182725]: 2026-01-22 22:26:24.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:28 compute-0 nova_compute[182725]: 2026-01-22 22:26:28.766 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:29 compute-0 nova_compute[182725]: 2026-01-22 22:26:29.257 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:33 compute-0 nova_compute[182725]: 2026-01-22 22:26:33.769 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.010 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.010 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.028 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.176 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.176 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.183 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.183 182729 INFO nova.compute.claims [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.259 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.398 182729 DEBUG nova.compute.provider_tree [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.412 182729 DEBUG nova.scheduler.client.report [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.434 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.435 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.533 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.534 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.586 182729 INFO nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.617 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.784 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.786 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.787 182729 INFO nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Creating image(s)
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.788 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.788 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.789 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.806 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.868 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.869 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.870 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.883 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.949 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.950 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.992 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.993 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:34 compute-0 nova_compute[182725]: 2026-01-22 22:26:34.994 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.053 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.054 182729 DEBUG nova.virt.disk.api [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Checking if we can resize image /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.054 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.114 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.115 182729 DEBUG nova.virt.disk.api [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Cannot resize image /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.115 182729 DEBUG nova.objects.instance [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f855bce-32a6-4b71-b6fd-0647459a5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.147 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.147 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Ensure instance console log exists: /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.148 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.148 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.149 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:35 compute-0 nova_compute[182725]: 2026-01-22 22:26:35.315 182729 DEBUG nova.policy [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95cf9999380d48108a561554c1897f15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:26:37 compute-0 podman[218735]: 2026-01-22 22:26:37.118234097 +0000 UTC m=+0.053743613 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:26:37 compute-0 nova_compute[182725]: 2026-01-22 22:26:37.422 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Successfully created port: a974255e-8591-4ca5-9d2b-5f038b1ff572 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.385 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Successfully updated port: a974255e-8591-4ca5-9d2b-5f038b1ff572 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.414 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.415 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.415 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.547 182729 DEBUG nova.compute.manager [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-changed-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.548 182729 DEBUG nova.compute.manager [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Refreshing instance network info cache due to event network-changed-a974255e-8591-4ca5-9d2b-5f038b1ff572. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.548 182729 DEBUG oslo_concurrency.lockutils [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.653 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:26:38 compute-0 nova_compute[182725]: 2026-01-22 22:26:38.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:39 compute-0 podman[218756]: 2026-01-22 22:26:39.137927952 +0000 UTC m=+0.060577763 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, version=9.6, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 22 22:26:39 compute-0 podman[218755]: 2026-01-22 22:26:39.167519651 +0000 UTC m=+0.096625023 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 22:26:39 compute-0 nova_compute[182725]: 2026-01-22 22:26:39.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.219 182729 DEBUG nova.network.neutron [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Updating instance_info_cache with network_info: [{"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.260 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.261 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Instance network_info: |[{"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.262 182729 DEBUG oslo_concurrency.lockutils [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.262 182729 DEBUG nova.network.neutron [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Refreshing network info cache for port a974255e-8591-4ca5-9d2b-5f038b1ff572 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.266 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Start _get_guest_xml network_info=[{"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.272 182729 WARNING nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.279 182729 DEBUG nova.virt.libvirt.host [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.279 182729 DEBUG nova.virt.libvirt.host [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.285 182729 DEBUG nova.virt.libvirt.host [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.285 182729 DEBUG nova.virt.libvirt.host [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.287 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.287 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.288 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.288 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.288 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.289 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.289 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.289 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.290 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.290 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.290 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.290 182729 DEBUG nova.virt.hardware [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.295 182729 DEBUG nova.virt.libvirt.vif [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-34337658',display_name='tempest-DeleteServersTestJSON-server-34337658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-34337658',id=61,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-cb0kh5cm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:34Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=5f855bce-32a6-4b71-b6fd-0647459a5872,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.295 182729 DEBUG nova.network.os_vif_util [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.296 182729 DEBUG nova.network.os_vif_util [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.298 182729 DEBUG nova.objects.instance [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f855bce-32a6-4b71-b6fd-0647459a5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.311 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <uuid>5f855bce-32a6-4b71-b6fd-0647459a5872</uuid>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <name>instance-0000003d</name>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:name>tempest-DeleteServersTestJSON-server-34337658</nova:name>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:26:40</nova:creationTime>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:user uuid="95cf9999380d48108a561554c1897f15">tempest-DeleteServersTestJSON-1655437746-project-member</nova:user>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:project uuid="9f8f780ce45a4950b1666a54cd9a5ba0">tempest-DeleteServersTestJSON-1655437746</nova:project>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         <nova:port uuid="a974255e-8591-4ca5-9d2b-5f038b1ff572">
Jan 22 22:26:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <system>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="serial">5f855bce-32a6-4b71-b6fd-0647459a5872</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="uuid">5f855bce-32a6-4b71-b6fd-0647459a5872</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </system>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <os>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </os>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <features>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </features>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.config"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:b8:79:ea"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <target dev="tapa974255e-85"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/console.log" append="off"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <video>
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </video>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:26:40 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:26:40 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:26:40 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:26:40 compute-0 nova_compute[182725]: </domain>
Jan 22 22:26:40 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.312 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Preparing to wait for external event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.313 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.313 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.313 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.314 182729 DEBUG nova.virt.libvirt.vif [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-34337658',display_name='tempest-DeleteServersTestJSON-server-34337658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-34337658',id=61,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-cb0kh5cm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:34Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=5f855bce-32a6-4b71-b6fd-0647459a5872,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.314 182729 DEBUG nova.network.os_vif_util [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.315 182729 DEBUG nova.network.os_vif_util [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.316 182729 DEBUG os_vif [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.316 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.317 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.317 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.320 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa974255e-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.321 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa974255e-85, col_values=(('external_ids', {'iface-id': 'a974255e-8591-4ca5-9d2b-5f038b1ff572', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:79:ea', 'vm-uuid': '5f855bce-32a6-4b71-b6fd-0647459a5872'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:40 compute-0 NetworkManager[54954]: <info>  [1769120800.3238] manager: (tapa974255e-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.326 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.331 182729 INFO os_vif [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85')
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.388 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.389 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.389 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No VIF found with MAC fa:16:3e:b8:79:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:26:40 compute-0 nova_compute[182725]: 2026-01-22 22:26:40.390 182729 INFO nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Using config drive
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.161 182729 INFO nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Creating config drive at /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.config
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.169 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvn56u50m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.303 182729 DEBUG oslo_concurrency.processutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvn56u50m" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:41 compute-0 kernel: tapa974255e-85: entered promiscuous mode
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.3753] manager: (tapa974255e-85): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Jan 22 22:26:41 compute-0 ovn_controller[94850]: 2026-01-22T22:26:41Z|00166|binding|INFO|Claiming lport a974255e-8591-4ca5-9d2b-5f038b1ff572 for this chassis.
Jan 22 22:26:41 compute-0 ovn_controller[94850]: 2026-01-22T22:26:41Z|00167|binding|INFO|a974255e-8591-4ca5-9d2b-5f038b1ff572: Claiming fa:16:3e:b8:79:ea 10.100.0.6
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.374 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.379 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.381 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 systemd-udevd[218820]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.403 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:79:ea 10.100.0.6'], port_security=['fa:16:3e:b8:79:ea 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5f855bce-32a6-4b71-b6fd-0647459a5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a974255e-8591-4ca5-9d2b-5f038b1ff572) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.405 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a974255e-8591-4ca5-9d2b-5f038b1ff572 in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad bound to our chassis
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.406 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:26:41 compute-0 systemd-machined[154006]: New machine qemu-25-instance-0000003d.
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.4227] device (tapa974255e-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.421 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c0543243-c01b-4e39-bf95-01c4bc0c0aa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.423 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap976277ea-61 in ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.4244] device (tapa974255e-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.425 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap976277ea-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.425 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf019b3-f347-4b8b-9f4c-051e304ae5be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.426 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60ca2c36-3a1e-4ffa-a9ee-321e285f7645]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 ovn_controller[94850]: 2026-01-22T22:26:41Z|00168|binding|INFO|Setting lport a974255e-8591-4ca5-9d2b-5f038b1ff572 ovn-installed in OVS
Jan 22 22:26:41 compute-0 ovn_controller[94850]: 2026-01-22T22:26:41Z|00169|binding|INFO|Setting lport a974255e-8591-4ca5-9d2b-5f038b1ff572 up in Southbound
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.440 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e1531cb4-aa02-478f-a63d-6e5fa71eb006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.441 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-0000003d.
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.465 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba36bf9-1dfe-4fcd-be5d-a32304e71fda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.502 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[14de5286-46e3-4919-8bc7-9d2cc90453bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.508 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d47a98-89c4-4e61-ad0c-95f2fb5d2293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.5092] manager: (tap976277ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Jan 22 22:26:41 compute-0 systemd-udevd[218824]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.542 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[363183ea-4dcc-4bf8-9ef1-60509094d449]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.546 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b988de9d-95c7-4738-8066-2b973b7876ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.5710] device (tap976277ea-60): carrier: link connected
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.579 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc8b8e6-b82c-4815-8cbf-0a1c60b8370d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.599 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[591d271a-0947-4369-b6a9-949814aed0f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443917, 'reachable_time': 28730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218854, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.621 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[92da64a2-4dda-4a3a-97c7-3dd8cc13cf12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:95d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443917, 'tstamp': 443917}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218855, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.642 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0590ad86-37cf-4448-ba36-fa8a8805eda5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443917, 'reachable_time': 28730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218856, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.680 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[be6eb09f-a606-4853-9230-cfeee76d1097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.746 182729 DEBUG nova.compute.manager [req-8f1cc45c-c24c-4e9b-bc40-7d78fb4522d4 req-ca4ad0c2-c98e-42f7-ad19-f7f5dda5ca14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.747 182729 DEBUG oslo_concurrency.lockutils [req-8f1cc45c-c24c-4e9b-bc40-7d78fb4522d4 req-ca4ad0c2-c98e-42f7-ad19-f7f5dda5ca14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.747 182729 DEBUG oslo_concurrency.lockutils [req-8f1cc45c-c24c-4e9b-bc40-7d78fb4522d4 req-ca4ad0c2-c98e-42f7-ad19-f7f5dda5ca14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.747 182729 DEBUG oslo_concurrency.lockutils [req-8f1cc45c-c24c-4e9b-bc40-7d78fb4522d4 req-ca4ad0c2-c98e-42f7-ad19-f7f5dda5ca14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.748 182729 DEBUG nova.compute.manager [req-8f1cc45c-c24c-4e9b-bc40-7d78fb4522d4 req-ca4ad0c2-c98e-42f7-ad19-f7f5dda5ca14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Processing event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.752 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7d96b2-1ecf-4cc2-a3ea-4680dd4618c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.754 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.754 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.755 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap976277ea-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:41 compute-0 NetworkManager[54954]: <info>  [1769120801.7575] manager: (tap976277ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 22 22:26:41 compute-0 kernel: tap976277ea-60: entered promiscuous mode
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.760 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap976277ea-60, col_values=(('external_ids', {'iface-id': '06db452d-91a0-4ebb-b584-a57953634a03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.762 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 ovn_controller[94850]: 2026-01-22T22:26:41Z|00170|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.764 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.765 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.766 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd7ad5f-5df3-4ffd-a017-0e3d962a4487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.766 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:26:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:41.767 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'env', 'PROCESS_TAG=haproxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/976277ea-61b2-4223-a8f7-3d46bf9c98ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.775 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.965 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120801.964831, 5f855bce-32a6-4b71-b6fd-0647459a5872 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.966 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] VM Started (Lifecycle Event)
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.968 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.973 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.977 182729 INFO nova.virt.libvirt.driver [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Instance spawned successfully.
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.977 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:26:41 compute-0 nova_compute[182725]: 2026-01-22 22:26:41.999 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.006 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.012 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.012 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.012 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.013 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.013 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.014 182729 DEBUG nova.virt.libvirt.driver [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.046 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.046 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120801.966207, 5f855bce-32a6-4b71-b6fd-0647459a5872 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.047 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] VM Paused (Lifecycle Event)
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.083 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.088 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120801.9725802, 5f855bce-32a6-4b71-b6fd-0647459a5872 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.089 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] VM Resumed (Lifecycle Event)
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.115 182729 INFO nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Took 7.33 seconds to spawn the instance on the hypervisor.
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.116 182729 DEBUG nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.126 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.132 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.161 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:26:42 compute-0 podman[218893]: 2026-01-22 22:26:42.166350941 +0000 UTC m=+0.063488116 container create 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:26:42 compute-0 systemd[1]: Started libpod-conmon-87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4.scope.
Jan 22 22:26:42 compute-0 podman[218893]: 2026-01-22 22:26:42.131685726 +0000 UTC m=+0.028822921 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.230 182729 DEBUG nova.network.neutron [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Updated VIF entry in instance network info cache for port a974255e-8591-4ca5-9d2b-5f038b1ff572. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.232 182729 DEBUG nova.network.neutron [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Updating instance_info_cache with network_info: [{"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:26:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:26:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ddd4256f7515be015ea69c868c9838141038a9ad6688e69fdab9205c6089f592/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:26:42 compute-0 podman[218893]: 2026-01-22 22:26:42.258849531 +0000 UTC m=+0.155986736 container init 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:26:42 compute-0 podman[218893]: 2026-01-22 22:26:42.264310197 +0000 UTC m=+0.161447372 container start 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.271 182729 DEBUG oslo_concurrency.lockutils [req-48e4c1a7-0e4e-4f76-8322-5b76024b6163 req-3e8607f9-cf19-4ca4-90af-a1a100425fe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5f855bce-32a6-4b71-b6fd-0647459a5872" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.285 182729 INFO nova.compute.manager [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Took 8.17 seconds to build instance.
Jan 22 22:26:42 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [NOTICE]   (218912) : New worker (218914) forked
Jan 22 22:26:42 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [NOTICE]   (218912) : Loading success.
Jan 22 22:26:42 compute-0 nova_compute[182725]: 2026-01-22 22:26:42.305 182729 DEBUG oslo_concurrency.lockutils [None req-2f8d479f-9786-4af1-8bd4-32176bd4afed 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.773 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.786 182729 DEBUG nova.objects.instance [None req-ce5bab9a-fb56-42b8-8d3c-59f14ed641cb 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f855bce-32a6-4b71-b6fd-0647459a5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.806 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120803.8053093, 5f855bce-32a6-4b71-b6fd-0647459a5872 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.807 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] VM Paused (Lifecycle Event)
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.836 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.840 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.857 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.874 182729 DEBUG nova.compute.manager [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.875 182729 DEBUG oslo_concurrency.lockutils [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.875 182729 DEBUG oslo_concurrency.lockutils [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.875 182729 DEBUG oslo_concurrency.lockutils [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.876 182729 DEBUG nova.compute.manager [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] No waiting events found dispatching network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:26:43 compute-0 nova_compute[182725]: 2026-01-22 22:26:43.876 182729 WARNING nova.compute.manager [req-35d505e1-5dde-4761-89f5-e79890ee8198 req-0801c66f-7325-49ab-9e47-8c93a9718f9f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received unexpected event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 for instance with vm_state active and task_state suspending.
Jan 22 22:26:44 compute-0 kernel: tapa974255e-85 (unregistering): left promiscuous mode
Jan 22 22:26:44 compute-0 NetworkManager[54954]: <info>  [1769120804.7351] device (tapa974255e-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:26:44 compute-0 ovn_controller[94850]: 2026-01-22T22:26:44Z|00171|binding|INFO|Releasing lport a974255e-8591-4ca5-9d2b-5f038b1ff572 from this chassis (sb_readonly=0)
Jan 22 22:26:44 compute-0 ovn_controller[94850]: 2026-01-22T22:26:44Z|00172|binding|INFO|Setting lport a974255e-8591-4ca5-9d2b-5f038b1ff572 down in Southbound
Jan 22 22:26:44 compute-0 nova_compute[182725]: 2026-01-22 22:26:44.751 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:44 compute-0 ovn_controller[94850]: 2026-01-22T22:26:44Z|00173|binding|INFO|Removing iface tapa974255e-85 ovn-installed in OVS
Jan 22 22:26:44 compute-0 nova_compute[182725]: 2026-01-22 22:26:44.753 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:44.762 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:79:ea 10.100.0.6'], port_security=['fa:16:3e:b8:79:ea 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5f855bce-32a6-4b71-b6fd-0647459a5872', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a974255e-8591-4ca5-9d2b-5f038b1ff572) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:26:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:44.763 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a974255e-8591-4ca5-9d2b-5f038b1ff572 in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad unbound from our chassis
Jan 22 22:26:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:44.765 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:26:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:44.766 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5f777840-ff64-4411-904e-5ee1b33c223c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:44.767 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace which is not needed anymore
Jan 22 22:26:44 compute-0 nova_compute[182725]: 2026-01-22 22:26:44.768 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:44 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 22 22:26:44 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Consumed 2.435s CPU time.
Jan 22 22:26:44 compute-0 systemd-machined[154006]: Machine qemu-25-instance-0000003d terminated.
Jan 22 22:26:44 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [NOTICE]   (218912) : haproxy version is 2.8.14-c23fe91
Jan 22 22:26:44 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [NOTICE]   (218912) : path to executable is /usr/sbin/haproxy
Jan 22 22:26:44 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [WARNING]  (218912) : Exiting Master process...
Jan 22 22:26:44 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [ALERT]    (218912) : Current worker (218914) exited with code 143 (Terminated)
Jan 22 22:26:44 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218908]: [WARNING]  (218912) : All workers exited. Exiting... (0)
Jan 22 22:26:44 compute-0 systemd[1]: libpod-87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4.scope: Deactivated successfully.
Jan 22 22:26:44 compute-0 podman[218950]: 2026-01-22 22:26:44.919398535 +0000 UTC m=+0.049629610 container died 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:26:44 compute-0 nova_compute[182725]: 2026-01-22 22:26:44.976 182729 DEBUG nova.compute.manager [None req-ce5bab9a-fb56-42b8-8d3c-59f14ed641cb 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4-userdata-shm.mount: Deactivated successfully.
Jan 22 22:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-ddd4256f7515be015ea69c868c9838141038a9ad6688e69fdab9205c6089f592-merged.mount: Deactivated successfully.
Jan 22 22:26:45 compute-0 podman[218950]: 2026-01-22 22:26:45.030638263 +0000 UTC m=+0.160869298 container cleanup 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:26:45 compute-0 systemd[1]: libpod-conmon-87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4.scope: Deactivated successfully.
Jan 22 22:26:45 compute-0 podman[218996]: 2026-01-22 22:26:45.186484043 +0000 UTC m=+0.131784571 container remove 87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.192 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[08e32e11-d958-4c05-bf45-429ed638d493]: (4, ('Thu Jan 22 10:26:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4)\n87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4\nThu Jan 22 10:26:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4)\n87bdf38094962d718cbeb0bb3786846fe4ccd6eeaa11921626c082c3b991acc4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.194 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[acf4c44c-4427-4d2f-97f5-f9fa1731b4ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.195 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:45 compute-0 nova_compute[182725]: 2026-01-22 22:26:45.197 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:45 compute-0 kernel: tap976277ea-60: left promiscuous mode
Jan 22 22:26:45 compute-0 nova_compute[182725]: 2026-01-22 22:26:45.208 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:45 compute-0 nova_compute[182725]: 2026-01-22 22:26:45.215 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.219 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5481b5c8-b4e3-4acc-9bbb-43ecd867dee8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.233 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[51a560d0-6da0-492b-ac5e-ae8e240c9c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.234 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2929731d-7948-46e9-8a0c-18e44a0b3961]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.252 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[273ba384-40fd-485e-9edb-cfab1e88885f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443909, 'reachable_time': 18970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219014, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.254 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.254 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ac1741-9e6a-4ed8-baab-6ea966fa0b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:26:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d976277ea\x2d61b2\x2d4223\x2da8f7\x2d3d46bf9c98ad.mount: Deactivated successfully.
Jan 22 22:26:45 compute-0 nova_compute[182725]: 2026-01-22 22:26:45.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:45 compute-0 nova_compute[182725]: 2026-01-22 22:26:45.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.381 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:26:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:45.382 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.587 182729 DEBUG nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-vif-unplugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.588 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.588 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.588 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.589 182729 DEBUG nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] No waiting events found dispatching network-vif-unplugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.589 182729 WARNING nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received unexpected event network-vif-unplugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 for instance with vm_state suspended and task_state None.
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.589 182729 DEBUG nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.590 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.590 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.590 182729 DEBUG oslo_concurrency.lockutils [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.591 182729 DEBUG nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] No waiting events found dispatching network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:26:46 compute-0 nova_compute[182725]: 2026-01-22 22:26:46.591 182729 WARNING nova.compute.manager [req-d1bb138a-0d14-4e58-8ef1-94242596f3a8 req-b1e59f98-8821-487f-b049-b16bc5501807 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received unexpected event network-vif-plugged-a974255e-8591-4ca5-9d2b-5f038b1ff572 for instance with vm_state suspended and task_state None.
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.308 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.309 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.310 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.310 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.311 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.328 182729 INFO nova.compute.manager [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Terminating instance
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.346 182729 DEBUG nova.compute.manager [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.356 182729 INFO nova.virt.libvirt.driver [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Instance destroyed successfully.
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.357 182729 DEBUG nova.objects.instance [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'resources' on Instance uuid 5f855bce-32a6-4b71-b6fd-0647459a5872 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.369 182729 DEBUG nova.virt.libvirt.vif [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-34337658',display_name='tempest-DeleteServersTestJSON-server-34337658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-34337658',id=61,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:26:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-cb0kh5cm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:26:45Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=5f855bce-32a6-4b71-b6fd-0647459a5872,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.370 182729 DEBUG nova.network.os_vif_util [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "address": "fa:16:3e:b8:79:ea", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa974255e-85", "ovs_interfaceid": "a974255e-8591-4ca5-9d2b-5f038b1ff572", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.371 182729 DEBUG nova.network.os_vif_util [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.372 182729 DEBUG os_vif [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.375 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.375 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa974255e-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.378 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.381 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.385 182729 INFO os_vif [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:ea,bridge_name='br-int',has_traffic_filtering=True,id=a974255e-8591-4ca5-9d2b-5f038b1ff572,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa974255e-85')
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.386 182729 INFO nova.virt.libvirt.driver [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Deleting instance files /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872_del
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.388 182729 INFO nova.virt.libvirt.driver [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Deletion of /var/lib/nova/instances/5f855bce-32a6-4b71-b6fd-0647459a5872_del complete
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.499 182729 INFO nova.compute.manager [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Took 0.15 seconds to destroy the instance on the hypervisor.
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.501 182729 DEBUG oslo.service.loopingcall [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.501 182729 DEBUG nova.compute.manager [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:26:47 compute-0 nova_compute[182725]: 2026-01-22 22:26:47.501 182729 DEBUG nova.network.neutron [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:26:48 compute-0 nova_compute[182725]: 2026-01-22 22:26:48.776 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:48 compute-0 nova_compute[182725]: 2026-01-22 22:26:48.927 182729 DEBUG nova.network.neutron [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:26:48 compute-0 nova_compute[182725]: 2026-01-22 22:26:48.947 182729 INFO nova.compute.manager [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Took 1.45 seconds to deallocate network for instance.
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.045 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.046 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.114 182729 DEBUG nova.compute.provider_tree [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.130 182729 DEBUG nova.scheduler.client.report [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.151 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:49 compute-0 podman[219016]: 2026-01-22 22:26:49.169622827 +0000 UTC m=+0.076390348 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.175 182729 INFO nova.scheduler.client.report [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Deleted allocations for instance 5f855bce-32a6-4b71-b6fd-0647459a5872
Jan 22 22:26:49 compute-0 podman[219015]: 2026-01-22 22:26:49.177655388 +0000 UTC m=+0.098115821 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:26:49 compute-0 nova_compute[182725]: 2026-01-22 22:26:49.264 182729 DEBUG oslo_concurrency.lockutils [None req-1908592e-5001-4489-9e22-2ce484e76400 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5f855bce-32a6-4b71-b6fd-0647459a5872" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:26:50.383 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:26:51 compute-0 nova_compute[182725]: 2026-01-22 22:26:51.104 182729 DEBUG nova.compute.manager [req-cff462d0-68d9-4b55-a8d6-aefb76e17091 req-555aaf8a-f770-4e75-a176-e477241db1e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Received event network-vif-deleted-a974255e-8591-4ca5-9d2b-5f038b1ff572 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:52 compute-0 nova_compute[182725]: 2026-01-22 22:26:52.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:53 compute-0 podman[219059]: 2026-01-22 22:26:53.138349492 +0000 UTC m=+0.074194684 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:26:53 compute-0 nova_compute[182725]: 2026-01-22 22:26:53.778 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.671 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.671 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.690 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.823 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.824 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.834 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:26:54 compute-0 nova_compute[182725]: 2026-01-22 22:26:54.834 182729 INFO nova.compute.claims [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.005 182729 DEBUG nova.compute.provider_tree [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.032 182729 DEBUG nova.scheduler.client.report [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.071 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.072 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.147 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.148 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.174 182729 INFO nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.196 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.341 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.343 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.343 182729 INFO nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Creating image(s)
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.344 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.344 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.345 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.358 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.453 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.455 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.457 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.477 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.539 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.541 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:55 compute-0 nova_compute[182725]: 2026-01-22 22:26:55.572 182729 DEBUG nova.policy [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95cf9999380d48108a561554c1897f15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.371 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk 1073741824" returned: 0 in 0.830s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.372 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.373 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.436 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.438 182729 DEBUG nova.virt.disk.api [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Checking if we can resize image /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.438 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.494 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.495 182729 DEBUG nova.virt.disk.api [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Cannot resize image /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.496 182729 DEBUG nova.objects.instance [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'migration_context' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.514 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.514 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Ensure instance console log exists: /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.515 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.515 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.515 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:26:56 compute-0 nova_compute[182725]: 2026-01-22 22:26:56.624 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Successfully created port: 6e2e52bd-d176-426a-9062-001bb36b8ada _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:26:57 compute-0 nova_compute[182725]: 2026-01-22 22:26:57.383 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.762 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Successfully updated port: 6e2e52bd-d176-426a-9062-001bb36b8ada _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.781 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.798 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.798 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.799 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.923 182729 DEBUG nova.compute.manager [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-changed-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.924 182729 DEBUG nova.compute.manager [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Refreshing instance network info cache due to event network-changed-6e2e52bd-d176-426a-9062-001bb36b8ada. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:26:58 compute-0 nova_compute[182725]: 2026-01-22 22:26:58.925 182729 DEBUG oslo_concurrency.lockutils [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:26:59 compute-0 nova_compute[182725]: 2026-01-22 22:26:59.146 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:26:59 compute-0 nova_compute[182725]: 2026-01-22 22:26:59.980 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120804.976612, 5f855bce-32a6-4b71-b6fd-0647459a5872 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:26:59 compute-0 nova_compute[182725]: 2026-01-22 22:26:59.981 182729 INFO nova.compute.manager [-] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] VM Stopped (Lifecycle Event)
Jan 22 22:27:00 compute-0 nova_compute[182725]: 2026-01-22 22:27:00.055 182729 DEBUG nova.compute.manager [None req-c24ee763-059c-49b2-9c19-565323844142 - - - - - -] [instance: 5f855bce-32a6-4b71-b6fd-0647459a5872] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.298 182729 DEBUG nova.network.neutron [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.327 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.327 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance network_info: |[{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.328 182729 DEBUG oslo_concurrency.lockutils [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.328 182729 DEBUG nova.network.neutron [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Refreshing network info cache for port 6e2e52bd-d176-426a-9062-001bb36b8ada _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.332 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start _get_guest_xml network_info=[{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.337 182729 WARNING nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.342 182729 DEBUG nova.virt.libvirt.host [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.343 182729 DEBUG nova.virt.libvirt.host [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.352 182729 DEBUG nova.virt.libvirt.host [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.353 182729 DEBUG nova.virt.libvirt.host [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.355 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.355 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.356 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.356 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.356 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.356 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.357 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.357 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.357 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.357 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.358 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.358 182729 DEBUG nova.virt.hardware [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.362 182729 DEBUG nova.virt.libvirt.vif [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:55Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.363 182729 DEBUG nova.network.os_vif_util [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.363 182729 DEBUG nova.network.os_vif_util [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.365 182729 DEBUG nova.objects.instance [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.383 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <uuid>a257831f-dd8b-4af2-afa1-e7cc926e23cf</uuid>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <name>instance-0000003f</name>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:name>tempest-DeleteServersTestJSON-server-240266002</nova:name>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:27:01</nova:creationTime>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:user uuid="95cf9999380d48108a561554c1897f15">tempest-DeleteServersTestJSON-1655437746-project-member</nova:user>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:project uuid="9f8f780ce45a4950b1666a54cd9a5ba0">tempest-DeleteServersTestJSON-1655437746</nova:project>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         <nova:port uuid="6e2e52bd-d176-426a-9062-001bb36b8ada">
Jan 22 22:27:01 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <system>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="serial">a257831f-dd8b-4af2-afa1-e7cc926e23cf</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="uuid">a257831f-dd8b-4af2-afa1-e7cc926e23cf</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </system>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <os>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </os>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <features>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </features>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:bb:e2:5d"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <target dev="tap6e2e52bd-d1"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/console.log" append="off"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <video>
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </video>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:27:01 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:27:01 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:27:01 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:27:01 compute-0 nova_compute[182725]: </domain>
Jan 22 22:27:01 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.384 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Preparing to wait for external event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.384 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.385 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.385 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.386 182729 DEBUG nova.virt.libvirt.vif [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:55Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.386 182729 DEBUG nova.network.os_vif_util [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.387 182729 DEBUG nova.network.os_vif_util [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.387 182729 DEBUG os_vif [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.388 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.388 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.389 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.392 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.393 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e2e52bd-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.393 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e2e52bd-d1, col_values=(('external_ids', {'iface-id': '6e2e52bd-d176-426a-9062-001bb36b8ada', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:e2:5d', 'vm-uuid': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.394 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:01 compute-0 NetworkManager[54954]: <info>  [1769120821.3960] manager: (tap6e2e52bd-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.400 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.401 182729 INFO os_vif [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1')
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.460 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.461 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.461 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No VIF found with MAC fa:16:3e:bb:e2:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:27:01 compute-0 nova_compute[182725]: 2026-01-22 22:27:01.462 182729 INFO nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Using config drive
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.420 182729 INFO nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Creating config drive at /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.426 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxiazq6w3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.567 182729 DEBUG oslo_concurrency.processutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxiazq6w3" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:02 compute-0 kernel: tap6e2e52bd-d1: entered promiscuous mode
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.655 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:02 compute-0 ovn_controller[94850]: 2026-01-22T22:27:02Z|00174|binding|INFO|Claiming lport 6e2e52bd-d176-426a-9062-001bb36b8ada for this chassis.
Jan 22 22:27:02 compute-0 ovn_controller[94850]: 2026-01-22T22:27:02Z|00175|binding|INFO|6e2e52bd-d176-426a-9062-001bb36b8ada: Claiming fa:16:3e:bb:e2:5d 10.100.0.9
Jan 22 22:27:02 compute-0 NetworkManager[54954]: <info>  [1769120822.6586] manager: (tap6e2e52bd-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.669 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e2:5d 10.100.0.9'], port_security=['fa:16:3e:bb:e2:5d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=6e2e52bd-d176-426a-9062-001bb36b8ada) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.671 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 6e2e52bd-d176-426a-9062-001bb36b8ada in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad bound to our chassis
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.674 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.684 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:02 compute-0 ovn_controller[94850]: 2026-01-22T22:27:02Z|00176|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada ovn-installed in OVS
Jan 22 22:27:02 compute-0 ovn_controller[94850]: 2026-01-22T22:27:02Z|00177|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada up in Southbound
Jan 22 22:27:02 compute-0 systemd-udevd[219117]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:27:02 compute-0 nova_compute[182725]: 2026-01-22 22:27:02.693 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.696 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7bd69d-57ab-4b4e-9ee6-74634029ff8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.697 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap976277ea-61 in ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.705 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap976277ea-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.705 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f71426-91ad-4980-971d-842e546c7447]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.706 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6f38cb20-1832-49cb-8773-9b89912298d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 NetworkManager[54954]: <info>  [1769120822.7080] device (tap6e2e52bd-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:27:02 compute-0 NetworkManager[54954]: <info>  [1769120822.7098] device (tap6e2e52bd-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:27:02 compute-0 systemd-machined[154006]: New machine qemu-26-instance-0000003f.
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.728 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc4003f-b4b2-4769-8d33-36e3c5a3d8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-0000003f.
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.760 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5918ef22-51d4-4826-8c00-aa8c23e875eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.803 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aa50ec31-76f3-43c4-a060-72004f600c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 NetworkManager[54954]: <info>  [1769120822.8124] manager: (tap976277ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.811 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7f28eeb7-916a-4ef6-922b-6b76e576b76d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.861 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[071185ea-cf01-49af-937a-1f472faa3c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.866 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c604c716-d38a-45ce-8c74-3262191b52f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 NetworkManager[54954]: <info>  [1769120822.8917] device (tap976277ea-60): carrier: link connected
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.897 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[173c73cd-05d9-44e1-8811-4cc06bee1eb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.915 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3beaaf1-503e-45d8-a183-b4e6a8f47033]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446049, 'reachable_time': 21685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219152, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.929 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b1acc311-0a9f-4561-aa5a-b5a6931df74e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:95d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446049, 'tstamp': 446049}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219153, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:02.958 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5036a446-a3ed-473d-ae98-b4d708f58253]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446049, 'reachable_time': 21685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219154, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.008 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[331853cc-5089-446e-871c-45d923f38e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.096 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3676eda2-4499-429c-91e3-62bb90d6ec34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.098 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.099 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.100 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap976277ea-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:03 compute-0 kernel: tap976277ea-60: entered promiscuous mode
Jan 22 22:27:03 compute-0 NetworkManager[54954]: <info>  [1769120823.1036] manager: (tap976277ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.104 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.110 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap976277ea-60, col_values=(('external_ids', {'iface-id': '06db452d-91a0-4ebb-b584-a57953634a03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:03 compute-0 ovn_controller[94850]: 2026-01-22T22:27:03Z|00178|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.112 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.113 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.115 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f40e6f-2061-45c3-a3b9-94d45e16d14b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.116 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:27:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:03.117 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'env', 'PROCESS_TAG=haproxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/976277ea-61b2-4223-a8f7-3d46bf9c98ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.126 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.213 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120823.2125788, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.214 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Started (Lifecycle Event)
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.235 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.241 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120823.213655, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.241 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Paused (Lifecycle Event)
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.266 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.272 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.302 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.423 182729 DEBUG nova.network.neutron [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updated VIF entry in instance network info cache for port 6e2e52bd-d176-426a-9062-001bb36b8ada. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.425 182729 DEBUG nova.network.neutron [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.441 182729 DEBUG oslo_concurrency.lockutils [req-a1469b4e-98f2-42f8-8994-8ed1c83da904 req-043a40ed-3f72-48f1-b1bf-3f6cb6de0010 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:03 compute-0 podman[219193]: 2026-01-22 22:27:03.542014873 +0000 UTC m=+0.053443895 container create 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:03 compute-0 systemd[1]: Started libpod-conmon-548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5.scope.
Jan 22 22:27:03 compute-0 podman[219193]: 2026-01-22 22:27:03.51584297 +0000 UTC m=+0.027272022 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:27:03 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:27:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25ebba1ea5a84f7e46db22c09275351c7a20df02d8833b4affa6e53246946e8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:27:03 compute-0 podman[219193]: 2026-01-22 22:27:03.632570914 +0000 UTC m=+0.143999986 container init 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:27:03 compute-0 podman[219193]: 2026-01-22 22:27:03.641335003 +0000 UTC m=+0.152764035 container start 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:27:03 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [NOTICE]   (219212) : New worker (219214) forked
Jan 22 22:27:03 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [NOTICE]   (219212) : Loading success.
Jan 22 22:27:03 compute-0 nova_compute[182725]: 2026-01-22 22:27:03.784 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.111 182729 DEBUG nova.compute.manager [req-ac127bea-d74e-4ca5-a18f-364c99fddb18 req-3bd1f2f3-910b-422f-9911-e9b863f7b40e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.112 182729 DEBUG oslo_concurrency.lockutils [req-ac127bea-d74e-4ca5-a18f-364c99fddb18 req-3bd1f2f3-910b-422f-9911-e9b863f7b40e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.112 182729 DEBUG oslo_concurrency.lockutils [req-ac127bea-d74e-4ca5-a18f-364c99fddb18 req-3bd1f2f3-910b-422f-9911-e9b863f7b40e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.112 182729 DEBUG oslo_concurrency.lockutils [req-ac127bea-d74e-4ca5-a18f-364c99fddb18 req-3bd1f2f3-910b-422f-9911-e9b863f7b40e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.112 182729 DEBUG nova.compute.manager [req-ac127bea-d74e-4ca5-a18f-364c99fddb18 req-3bd1f2f3-910b-422f-9911-e9b863f7b40e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Processing event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.113 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.118 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120825.117841, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.118 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Resumed (Lifecycle Event)
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.121 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.124 182729 INFO nova.virt.libvirt.driver [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance spawned successfully.
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.125 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.140 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.146 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.149 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.150 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.151 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.151 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.152 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.152 182729 DEBUG nova.virt.libvirt.driver [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.192 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.228 182729 INFO nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Took 9.89 seconds to spawn the instance on the hypervisor.
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.229 182729 DEBUG nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.309 182729 INFO nova.compute.manager [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Took 10.55 seconds to build instance.
Jan 22 22:27:05 compute-0 nova_compute[182725]: 2026-01-22 22:27:05.329 182729 DEBUG oslo_concurrency.lockutils [None req-9f219d2f-2f5f-4d72-ab99-b372a0dc7a07 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:06 compute-0 nova_compute[182725]: 2026-01-22 22:27:06.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.235 182729 DEBUG nova.compute.manager [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.236 182729 DEBUG oslo_concurrency.lockutils [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.236 182729 DEBUG oslo_concurrency.lockutils [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.236 182729 DEBUG oslo_concurrency.lockutils [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.236 182729 DEBUG nova.compute.manager [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:07 compute-0 nova_compute[182725]: 2026-01-22 22:27:07.237 182729 WARNING nova.compute.manager [req-a87b4fd7-f850-4c54-9a3a-c7c3927c6757 req-5f27b0e3-e6c6-4ad9-9acf-b72a973fc372 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state active and task_state None.
Jan 22 22:27:08 compute-0 podman[219223]: 2026-01-22 22:27:08.158071759 +0000 UTC m=+0.080002488 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:08 compute-0 nova_compute[182725]: 2026-01-22 22:27:08.786 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:10 compute-0 podman[219245]: 2026-01-22 22:27:10.133105738 +0000 UTC m=+0.062488981 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public)
Jan 22 22:27:10 compute-0 podman[219244]: 2026-01-22 22:27:10.206334246 +0000 UTC m=+0.133971425 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.409 182729 DEBUG nova.compute.manager [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.506 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.507 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.526 182729 DEBUG nova.objects.instance [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_requests' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.538 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.538 182729 INFO nova.compute.claims [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.539 182729 DEBUG nova.objects.instance [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'resources' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.548 182729 DEBUG nova.objects.instance [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.581 182729 INFO nova.compute.resource_tracker [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating resource usage from migration aac9dbe7-76f6-46df-ae50-ed1da11d6e54
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.630 182729 DEBUG nova.compute.provider_tree [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.648 182729 DEBUG nova.scheduler.client.report [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.674 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.675 182729 INFO nova.compute.manager [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Migrating
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.718 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.719 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:27:10 compute-0 nova_compute[182725]: 2026-01-22 22:27:10.719 182729 DEBUG nova.network.neutron [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.399 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.909 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.910 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.910 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.910 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:27:11 compute-0 nova_compute[182725]: 2026-01-22 22:27:11.989 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.090 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.091 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.114 182729 DEBUG nova.network.neutron [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.134 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.160 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.254 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.256 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.312 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.313 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5472MB free_disk=73.37609481811523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.313 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.314 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.378 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Applying migration context for instance a257831f-dd8b-4af2-afa1-e7cc926e23cf as it has an incoming, in-progress migration aac9dbe7-76f6-46df-ae50-ed1da11d6e54. Migration status is migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.379 182729 INFO nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating resource usage from migration aac9dbe7-76f6-46df-ae50-ed1da11d6e54
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.396 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Migration aac9dbe7-76f6-46df-ae50-ed1da11d6e54 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.397 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance a257831f-dd8b-4af2-afa1-e7cc926e23cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.397 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.397 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:27:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:12.435 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:12.436 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:12.436 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.465 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.480 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.507 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:27:12 compute-0 nova_compute[182725]: 2026-01-22 22:27:12.508 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.504 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.788 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.908 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.908 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.908 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:27:13 compute-0 nova_compute[182725]: 2026-01-22 22:27:13.909 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:15 compute-0 nova_compute[182725]: 2026-01-22 22:27:15.495 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:15 compute-0 nova_compute[182725]: 2026-01-22 22:27:15.511 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:15 compute-0 nova_compute[182725]: 2026-01-22 22:27:15.512 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:27:15 compute-0 nova_compute[182725]: 2026-01-22 22:27:15.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:16 compute-0 nova_compute[182725]: 2026-01-22 22:27:16.402 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:16 compute-0 nova_compute[182725]: 2026-01-22 22:27:16.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:16 compute-0 nova_compute[182725]: 2026-01-22 22:27:16.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:16 compute-0 nova_compute[182725]: 2026-01-22 22:27:16.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:27:17 compute-0 ovn_controller[94850]: 2026-01-22T22:27:17Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bb:e2:5d 10.100.0.9
Jan 22 22:27:17 compute-0 ovn_controller[94850]: 2026-01-22T22:27:17Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:e2:5d 10.100.0.9
Jan 22 22:27:18 compute-0 nova_compute[182725]: 2026-01-22 22:27:18.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:18 compute-0 nova_compute[182725]: 2026-01-22 22:27:18.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:20 compute-0 podman[219311]: 2026-01-22 22:27:20.124498057 +0000 UTC m=+0.051380693 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:27:20 compute-0 podman[219310]: 2026-01-22 22:27:20.14702943 +0000 UTC m=+0.076566293 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:27:20 compute-0 nova_compute[182725]: 2026-01-22 22:27:20.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:21 compute-0 nova_compute[182725]: 2026-01-22 22:27:21.406 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:21 compute-0 nova_compute[182725]: 2026-01-22 22:27:21.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:22 compute-0 nova_compute[182725]: 2026-01-22 22:27:22.300 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:27:23 compute-0 nova_compute[182725]: 2026-01-22 22:27:23.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:23 compute-0 nova_compute[182725]: 2026-01-22 22:27:23.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:27:24 compute-0 podman[219350]: 2026-01-22 22:27:24.118376241 +0000 UTC m=+0.052163583 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:27:24 compute-0 kernel: tap6e2e52bd-d1 (unregistering): left promiscuous mode
Jan 22 22:27:24 compute-0 NetworkManager[54954]: <info>  [1769120844.5217] device (tap6e2e52bd-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:27:24 compute-0 nova_compute[182725]: 2026-01-22 22:27:24.532 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:24 compute-0 ovn_controller[94850]: 2026-01-22T22:27:24Z|00179|binding|INFO|Releasing lport 6e2e52bd-d176-426a-9062-001bb36b8ada from this chassis (sb_readonly=0)
Jan 22 22:27:24 compute-0 ovn_controller[94850]: 2026-01-22T22:27:24Z|00180|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada down in Southbound
Jan 22 22:27:24 compute-0 ovn_controller[94850]: 2026-01-22T22:27:24Z|00181|binding|INFO|Removing iface tap6e2e52bd-d1 ovn-installed in OVS
Jan 22 22:27:24 compute-0 nova_compute[182725]: 2026-01-22 22:27:24.534 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:24 compute-0 nova_compute[182725]: 2026-01-22 22:27:24.555 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:24 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 22 22:27:24 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Consumed 13.219s CPU time.
Jan 22 22:27:24 compute-0 systemd-machined[154006]: Machine qemu-26-instance-0000003f terminated.
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.624 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e2:5d 10.100.0.9'], port_security=['fa:16:3e:bb:e2:5d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=6e2e52bd-d176-426a-9062-001bb36b8ada) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.625 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 6e2e52bd-d176-426a-9062-001bb36b8ada in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad unbound from our chassis
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.627 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.628 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94199e8e-43aa-473e-be8e-d04e0c87268d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.629 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace which is not needed anymore
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [NOTICE]   (219212) : haproxy version is 2.8.14-c23fe91
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [NOTICE]   (219212) : path to executable is /usr/sbin/haproxy
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [WARNING]  (219212) : Exiting Master process...
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [WARNING]  (219212) : Exiting Master process...
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [ALERT]    (219212) : Current worker (219214) exited with code 143 (Terminated)
Jan 22 22:27:24 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219208]: [WARNING]  (219212) : All workers exited. Exiting... (0)
Jan 22 22:27:24 compute-0 systemd[1]: libpod-548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5.scope: Deactivated successfully.
Jan 22 22:27:24 compute-0 podman[219397]: 2026-01-22 22:27:24.80724774 +0000 UTC m=+0.067751953 container died 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:27:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5-userdata-shm.mount: Deactivated successfully.
Jan 22 22:27:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-25ebba1ea5a84f7e46db22c09275351c7a20df02d8833b4affa6e53246946e8e-merged.mount: Deactivated successfully.
Jan 22 22:27:24 compute-0 podman[219397]: 2026-01-22 22:27:24.845900305 +0000 UTC m=+0.106404508 container cleanup 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:27:24 compute-0 systemd[1]: libpod-conmon-548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5.scope: Deactivated successfully.
Jan 22 22:27:24 compute-0 podman[219444]: 2026-01-22 22:27:24.909283798 +0000 UTC m=+0.037047716 container remove 548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.914 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[53fd20c4-b507-42be-9442-3220d6ba999e]: (4, ('Thu Jan 22 10:27:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5)\n548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5\nThu Jan 22 10:27:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5)\n548708227d67c600ae977891faa0ac3a5a5899afec852fc04f4e87e2beaf88a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.917 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[33328d10-7403-415d-8076-19b41cf1bdb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.918 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:24 compute-0 nova_compute[182725]: 2026-01-22 22:27:24.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:24 compute-0 kernel: tap976277ea-60: left promiscuous mode
Jan 22 22:27:24 compute-0 nova_compute[182725]: 2026-01-22 22:27:24.934 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.939 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9b67aa3f-4988-461b-b38e-beea66db6e15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.951 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b11dfa-8a48-41cb-b918-722efa194460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.952 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc8877a-c5fa-4668-862f-cb2a83554c2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.967 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a5028d-d044-4fdc-84ef-a06a636212f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446039, 'reachable_time': 30390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219461, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d976277ea\x2d61b2\x2d4223\x2da8f7\x2d3d46bf9c98ad.mount: Deactivated successfully.
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.970 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:27:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:24.970 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[ac98b56c-7c9e-497a-8136-ce87ae2bab0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.318 182729 INFO nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance shutdown successfully after 13 seconds.
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.323 182729 INFO nova.virt.libvirt.driver [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance destroyed successfully.
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.324 182729 DEBUG nova.virt.libvirt.vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_mod
el='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:10Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.325 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.326 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.327 182729 DEBUG os_vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.330 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e2e52bd-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.333 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.334 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.337 182729 INFO os_vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1')
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.341 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.407 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.409 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.484 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.486 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.522 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.523 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk.config /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.556 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk.config /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.557 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk.info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.590 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "cp -r /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_resize/disk.info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.800 182729 DEBUG nova.network.neutron [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Port 6e2e52bd-d176-426a-9062-001bb36b8ada binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.936 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.937 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:25 compute-0 nova_compute[182725]: 2026-01-22 22:27:25.937 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.043 182729 DEBUG nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.044 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.044 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.044 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.045 182729 DEBUG nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.045 182729 WARNING nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state active and task_state resize_migrated.
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.045 182729 DEBUG nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.045 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.046 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.046 182729 DEBUG oslo_concurrency.lockutils [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.046 182729 DEBUG nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.047 182729 WARNING nova.compute.manager [req-b2cc1277-65c1-41a8-a3f5-c015bb68b814 req-75e8273f-e98b-4ef9-9595-c1b9ee6f97f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state active and task_state resize_migrated.
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.150 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.150 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:27:26 compute-0 nova_compute[182725]: 2026-01-22 22:27:26.151 182729 DEBUG nova.network.neutron [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.397 182729 DEBUG nova.network.neutron [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.422 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.546 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.547 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.548 182729 INFO nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Creating image(s)
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.549 182729 DEBUG nova.objects.instance [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.562 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.627 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.628 182729 DEBUG nova.virt.disk.api [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Checking if we can resize image /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.628 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.687 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.688 182729 DEBUG nova.virt.disk.api [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Cannot resize image /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.713 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.713 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Ensure instance console log exists: /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.714 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.714 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.714 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.716 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start _get_guest_xml network_info=[{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.722 182729 WARNING nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.733 182729 DEBUG nova.virt.libvirt.host [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.734 182729 DEBUG nova.virt.libvirt.host [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.737 182729 DEBUG nova.virt.libvirt.host [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.738 182729 DEBUG nova.virt.libvirt.host [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.739 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.739 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.740 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.740 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.740 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.740 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.741 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.741 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.741 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.741 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.741 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.742 182729 DEBUG nova.virt.hardware [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.742 182729 DEBUG nova.objects.instance [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.762 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.826 182729 DEBUG oslo_concurrency.processutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.827 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.827 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.828 182729 DEBUG oslo_concurrency.lockutils [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.829 182729 DEBUG nova.virt.libvirt.vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:25Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.829 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.830 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.833 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <uuid>a257831f-dd8b-4af2-afa1-e7cc926e23cf</uuid>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <name>instance-0000003f</name>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <memory>196608</memory>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:name>tempest-DeleteServersTestJSON-server-240266002</nova:name>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:27:27</nova:creationTime>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:flavor name="m1.micro">
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:memory>192</nova:memory>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:user uuid="95cf9999380d48108a561554c1897f15">tempest-DeleteServersTestJSON-1655437746-project-member</nova:user>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:project uuid="9f8f780ce45a4950b1666a54cd9a5ba0">tempest-DeleteServersTestJSON-1655437746</nova:project>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         <nova:port uuid="6e2e52bd-d176-426a-9062-001bb36b8ada">
Jan 22 22:27:27 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <system>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="serial">a257831f-dd8b-4af2-afa1-e7cc926e23cf</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="uuid">a257831f-dd8b-4af2-afa1-e7cc926e23cf</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </system>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <os>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </os>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <features>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </features>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/disk.config"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:bb:e2:5d"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <target dev="tap6e2e52bd-d1"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf/console.log" append="off"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <video>
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </video>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:27:27 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:27:27 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:27:27 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:27:27 compute-0 nova_compute[182725]: </domain>
Jan 22 22:27:27 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.834 182729 DEBUG nova.virt.libvirt.vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:25Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.834 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-311555490-network", "vif_mac": "fa:16:3e:bb:e2:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.834 182729 DEBUG nova.network.os_vif_util [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.835 182729 DEBUG os_vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.836 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.836 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.837 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.839 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.840 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e2e52bd-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.840 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e2e52bd-d1, col_values=(('external_ids', {'iface-id': '6e2e52bd-d176-426a-9062-001bb36b8ada', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:e2:5d', 'vm-uuid': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:27 compute-0 NetworkManager[54954]: <info>  [1769120847.8437] manager: (tap6e2e52bd-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.845 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.852 182729 INFO os_vif [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1')
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.915 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.915 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.916 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No VIF found with MAC fa:16:3e:bb:e2:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.916 182729 INFO nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Using config drive
Jan 22 22:27:27 compute-0 kernel: tap6e2e52bd-d1: entered promiscuous mode
Jan 22 22:27:27 compute-0 NetworkManager[54954]: <info>  [1769120847.9831] manager: (tap6e2e52bd-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Jan 22 22:27:27 compute-0 ovn_controller[94850]: 2026-01-22T22:27:27Z|00182|binding|INFO|Claiming lport 6e2e52bd-d176-426a-9062-001bb36b8ada for this chassis.
Jan 22 22:27:27 compute-0 ovn_controller[94850]: 2026-01-22T22:27:27Z|00183|binding|INFO|6e2e52bd-d176-426a-9062-001bb36b8ada: Claiming fa:16:3e:bb:e2:5d 10.100.0.9
Jan 22 22:27:27 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.985 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:27.993 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e2:5d 10.100.0.9'], port_security=['fa:16:3e:bb:e2:5d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=6e2e52bd-d176-426a-9062-001bb36b8ada) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:27:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:27.994 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 6e2e52bd-d176-426a-9062-001bb36b8ada in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad bound to our chassis
Jan 22 22:27:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:27.995 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:27 compute-0 ovn_controller[94850]: 2026-01-22T22:27:27Z|00184|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada ovn-installed in OVS
Jan 22 22:27:27 compute-0 ovn_controller[94850]: 2026-01-22T22:27:27Z|00185|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada up in Southbound
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:27.999 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.009 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef146217-1d36-45e9-87af-abdcc16eedce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.010 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap976277ea-61 in ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.013 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap976277ea-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.013 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bc69f3c0-4f1d-4c0b-8619-17783de081d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.015 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5b3af8-d271-488f-8f83-6b87cb61269d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 systemd-udevd[219497]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.029 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd2c374-1edf-4906-b463-a26c33183f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 systemd-machined[154006]: New machine qemu-27-instance-0000003f.
Jan 22 22:27:28 compute-0 NetworkManager[54954]: <info>  [1769120848.0351] device (tap6e2e52bd-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:27:28 compute-0 NetworkManager[54954]: <info>  [1769120848.0357] device (tap6e2e52bd-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:27:28 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-0000003f.
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.059 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd628382-1314-4b27-96c0-ed8f27e16874]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.094 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[96ade249-2634-4ac0-a47e-30cb29668602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.102 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[31895d6d-e2ac-45f4-afef-5e6176bce610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 systemd-udevd[219501]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:27:28 compute-0 NetworkManager[54954]: <info>  [1769120848.1046] manager: (tap976277ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.141 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[06d8283b-0976-4207-a04f-9ab47f2d410f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.145 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7389aad8-0c38-4e42-acd7-07b60cfe2ad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 NetworkManager[54954]: <info>  [1769120848.1722] device (tap976277ea-60): carrier: link connected
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.180 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[925abe6b-e487-4ebb-b670-974ebd379cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.199 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e48ca3d3-5a2f-4114-843b-a6a5d4576a83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448577, 'reachable_time': 35512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219530, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.218 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d59e7942-4bb0-447a-bd77-ab8c51c0c6f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:95d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448577, 'tstamp': 448577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219531, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.240 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1b75cd62-9669-4905-b477-84aaedc09a00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448577, 'reachable_time': 35512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219534, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.281 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a08d5289-e210-4324-bfd6-5f6fd73a41e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.338 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for a257831f-dd8b-4af2-afa1-e7cc926e23cf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.338 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120848.3377593, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.339 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Resumed (Lifecycle Event)
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.341 182729 DEBUG nova.compute.manager [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.347 182729 INFO nova.virt.libvirt.driver [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance running successfully.
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.348 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6aee8374-0754-4d13-9cc9-226ab28b6a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.349 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.351 182729 DEBUG nova.virt.libvirt.guest [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.352 182729 DEBUG nova.virt.libvirt.driver [None req-a538cdbb-878b-437e-bcb0-86f7926f8251 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.351 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.354 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap976277ea-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.356 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 NetworkManager[54954]: <info>  [1769120848.3573] manager: (tap976277ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 22 22:27:28 compute-0 kernel: tap976277ea-60: entered promiscuous mode
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.362 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap976277ea-60, col_values=(('external_ids', {'iface-id': '06db452d-91a0-4ebb-b584-a57953634a03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.363 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 ovn_controller[94850]: 2026-01-22T22:27:28Z|00186|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.369 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.372 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.392 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.393 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.394 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[830af71d-37b3-4b43-be1e-4bacbff81b4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.396 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:27:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:28.397 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'env', 'PROCESS_TAG=haproxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/976277ea-61b2-4223-a8f7-3d46bf9c98ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.404 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.404 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120848.3406932, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.404 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Started (Lifecycle Event)
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.436 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.441 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.517 182729 DEBUG nova.compute.manager [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.518 182729 DEBUG oslo_concurrency.lockutils [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.519 182729 DEBUG oslo_concurrency.lockutils [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.519 182729 DEBUG oslo_concurrency.lockutils [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.519 182729 DEBUG nova.compute.manager [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.520 182729 WARNING nova.compute.manager [req-10d3dda0-2aba-4db4-8bd1-ff31cb01523a req-a4ddb56b-d416-4cfe-af8f-846f2ca62a95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state resized and task_state None.
Jan 22 22:27:28 compute-0 nova_compute[182725]: 2026-01-22 22:27:28.796 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:28 compute-0 podman[219571]: 2026-01-22 22:27:28.832903782 +0000 UTC m=+0.029105550 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:27:29 compute-0 podman[219571]: 2026-01-22 22:27:29.052839168 +0000 UTC m=+0.249040856 container create 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:27:29 compute-0 systemd[1]: Started libpod-conmon-97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8.scope.
Jan 22 22:27:29 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:27:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14f29d8ce74002c3d5f00f33137b928ce4d03594150afb68194cce104473ae29/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:27:29 compute-0 podman[219571]: 2026-01-22 22:27:29.230071439 +0000 UTC m=+0.426273197 container init 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:29 compute-0 podman[219571]: 2026-01-22 22:27:29.238565934 +0000 UTC m=+0.434767642 container start 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:29 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [NOTICE]   (219593) : New worker (219595) forked
Jan 22 22:27:29 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [NOTICE]   (219593) : Loading success.
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.356 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.357 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.357 182729 DEBUG nova.compute.manager [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.390 182729 DEBUG nova.objects.instance [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'info_cache' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.620 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.621 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.621 182729 DEBUG nova.network.neutron [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.688 182729 DEBUG nova.compute.manager [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.689 182729 DEBUG oslo_concurrency.lockutils [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.689 182729 DEBUG oslo_concurrency.lockutils [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.689 182729 DEBUG oslo_concurrency.lockutils [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.690 182729 DEBUG nova.compute.manager [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:30 compute-0 nova_compute[182725]: 2026-01-22 22:27:30.690 182729 WARNING nova.compute.manager [req-97c935c3-c546-4188-bbf0-705e89de778d req-bf1ee6a6-9589-45e5-b1dd-b843dd1f4d84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state resized and task_state deleting.
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.761 182729 DEBUG nova.network.neutron [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [{"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.783 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a257831f-dd8b-4af2-afa1-e7cc926e23cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.784 182729 DEBUG nova.objects.instance [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'migration_context' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.799 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.800 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.893 182729 DEBUG nova.compute.provider_tree [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.908 182729 DEBUG nova.scheduler.client.report [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:27:31 compute-0 nova_compute[182725]: 2026-01-22 22:27:31.980 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.159 182729 INFO nova.scheduler.client.report [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Deleted allocation for migration aac9dbe7-76f6-46df-ae50-ed1da11d6e54
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.252 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 1.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.314 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.315 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.316 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.316 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.317 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.334 182729 INFO nova.compute.manager [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Terminating instance
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.348 182729 DEBUG nova.compute.manager [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:27:32 compute-0 kernel: tap6e2e52bd-d1 (unregistering): left promiscuous mode
Jan 22 22:27:32 compute-0 NetworkManager[54954]: <info>  [1769120852.3714] device (tap6e2e52bd-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:27:32 compute-0 ovn_controller[94850]: 2026-01-22T22:27:32Z|00187|binding|INFO|Releasing lport 6e2e52bd-d176-426a-9062-001bb36b8ada from this chassis (sb_readonly=0)
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.377 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 ovn_controller[94850]: 2026-01-22T22:27:32Z|00188|binding|INFO|Setting lport 6e2e52bd-d176-426a-9062-001bb36b8ada down in Southbound
Jan 22 22:27:32 compute-0 ovn_controller[94850]: 2026-01-22T22:27:32Z|00189|binding|INFO|Removing iface tap6e2e52bd-d1 ovn-installed in OVS
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.388 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e2:5d 10.100.0.9'], port_security=['fa:16:3e:bb:e2:5d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a257831f-dd8b-4af2-afa1-e7cc926e23cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=6e2e52bd-d176-426a-9062-001bb36b8ada) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.390 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 6e2e52bd-d176-426a-9062-001bb36b8ada in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad unbound from our chassis
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.393 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.394 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c76419-6d72-444a-ba26-9a7f4c64ca2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.395 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace which is not needed anymore
Jan 22 22:27:32 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 22 22:27:32 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003f.scope: Consumed 4.382s CPU time.
Jan 22 22:27:32 compute-0 systemd-machined[154006]: Machine qemu-27-instance-0000003f terminated.
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [NOTICE]   (219593) : haproxy version is 2.8.14-c23fe91
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [NOTICE]   (219593) : path to executable is /usr/sbin/haproxy
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [WARNING]  (219593) : Exiting Master process...
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [WARNING]  (219593) : Exiting Master process...
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [ALERT]    (219593) : Current worker (219595) exited with code 143 (Terminated)
Jan 22 22:27:32 compute-0 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[219589]: [WARNING]  (219593) : All workers exited. Exiting... (0)
Jan 22 22:27:32 compute-0 systemd[1]: libpod-97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8.scope: Deactivated successfully.
Jan 22 22:27:32 compute-0 podman[219629]: 2026-01-22 22:27:32.608736555 +0000 UTC m=+0.072263786 container died 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.631 182729 INFO nova.virt.libvirt.driver [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Instance destroyed successfully.
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.632 182729 DEBUG nova.objects.instance [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'resources' on Instance uuid a257831f-dd8b-4af2-afa1-e7cc926e23cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:27:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8-userdata-shm.mount: Deactivated successfully.
Jan 22 22:27:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-14f29d8ce74002c3d5f00f33137b928ce4d03594150afb68194cce104473ae29-merged.mount: Deactivated successfully.
Jan 22 22:27:32 compute-0 podman[219629]: 2026-01-22 22:27:32.647537321 +0000 UTC m=+0.111064542 container cleanup 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.655 182729 DEBUG nova.virt.libvirt.vif [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-240266002',display_name='tempest-DeleteServersTestJSON-server-240266002',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-240266002',id=63,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-s4xtmr4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:28Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a257831f-dd8b-4af2-afa1-e7cc926e23cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.656 182729 DEBUG nova.network.os_vif_util [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "6e2e52bd-d176-426a-9062-001bb36b8ada", "address": "fa:16:3e:bb:e2:5d", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e2e52bd-d1", "ovs_interfaceid": "6e2e52bd-d176-426a-9062-001bb36b8ada", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.657 182729 DEBUG nova.network.os_vif_util [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.658 182729 DEBUG os_vif [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.664 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 systemd[1]: libpod-conmon-97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8.scope: Deactivated successfully.
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.665 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e2e52bd-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.667 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.671 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.675 182729 INFO os_vif [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e2:5d,bridge_name='br-int',has_traffic_filtering=True,id=6e2e52bd-d176-426a-9062-001bb36b8ada,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e2e52bd-d1')
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.677 182729 INFO nova.virt.libvirt.driver [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Deleting instance files /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_del
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.682 182729 INFO nova.virt.libvirt.driver [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Deletion of /var/lib/nova/instances/a257831f-dd8b-4af2-afa1-e7cc926e23cf_del complete
Jan 22 22:27:32 compute-0 podman[219676]: 2026-01-22 22:27:32.737766442 +0000 UTC m=+0.056041124 container remove 97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.744 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae33767-380b-4630-ae85-dc8489096076]: (4, ('Thu Jan 22 10:27:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8)\n97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8\nThu Jan 22 10:27:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8)\n97a97415529abed1fc6310f7a58bfb8b5cd821c63becf0d9037802a3576b9de8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.747 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fafca979-b7fa-4332-8b03-d9d2c46773c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.749 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.752 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 kernel: tap976277ea-60: left promiscuous mode
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.779 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.783 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2229f4-336a-4458-81b6-3392cd337ded]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.807 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[906f0fe7-3582-4f67-ab83-ece74c84c938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.809 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[21b37c2f-ec7b-4495-b682-065757b0fb06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.822 182729 DEBUG nova.compute.manager [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.823 182729 DEBUG oslo_concurrency.lockutils [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.824 182729 DEBUG oslo_concurrency.lockutils [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.825 182729 DEBUG oslo_concurrency.lockutils [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.825 182729 DEBUG nova.compute.manager [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.825 182729 WARNING nova.compute.manager [req-ce141f3c-c3f9-417e-9d7b-84cb831f4e06 req-30d51630-6919-4421-89ca-7837ac15159a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-unplugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state active and task_state None.
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.827 182729 INFO nova.compute.manager [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Took 0.48 seconds to destroy the instance on the hypervisor.
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.828 182729 DEBUG oslo.service.loopingcall [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.829 182729 DEBUG nova.compute.manager [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:27:32 compute-0 nova_compute[182725]: 2026-01-22 22:27:32.829 182729 DEBUG nova.network.neutron [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.840 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6986b155-1b52-410b-ba74-e6403ba2a0ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448569, 'reachable_time': 16756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219691, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.844 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:27:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:32.845 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[af0d467c-07d2-49d0-af53-03c75e762c5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:27:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d976277ea\x2d61b2\x2d4223\x2da8f7\x2d3d46bf9c98ad.mount: Deactivated successfully.
Jan 22 22:27:33 compute-0 nova_compute[182725]: 2026-01-22 22:27:33.798 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.317 182729 DEBUG nova.compute.manager [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.318 182729 DEBUG oslo_concurrency.lockutils [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.318 182729 DEBUG oslo_concurrency.lockutils [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.319 182729 DEBUG oslo_concurrency.lockutils [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.319 182729 DEBUG nova.compute.manager [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] No waiting events found dispatching network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.319 182729 WARNING nova.compute.manager [req-d95ac826-b2e2-4e67-845c-034cb2b56c31 req-500c9125-3217-46d0-b643-dac44fe84c4c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received unexpected event network-vif-plugged-6e2e52bd-d176-426a-9062-001bb36b8ada for instance with vm_state active and task_state None.
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.486 182729 DEBUG nova.network.neutron [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.516 182729 INFO nova.compute.manager [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Took 2.69 seconds to deallocate network for instance.
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.581 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.582 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.588 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.611 182729 INFO nova.scheduler.client.report [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Deleted allocations for instance a257831f-dd8b-4af2-afa1-e7cc926e23cf
Jan 22 22:27:35 compute-0 nova_compute[182725]: 2026-01-22 22:27:35.698 182729 DEBUG oslo_concurrency.lockutils [None req-69bfd67b-52bf-49f3-a977-f4ee31883e8e 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a257831f-dd8b-4af2-afa1-e7cc926e23cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:27:37 compute-0 nova_compute[182725]: 2026-01-22 22:27:37.426 182729 DEBUG nova.compute.manager [req-9285e6b9-e9eb-41fc-93dd-9e6c9d9f277d req-21fdcf27-efe9-499d-bfe5-e341b30a9cc9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Received event network-vif-deleted-6e2e52bd-d176-426a-9062-001bb36b8ada external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:27:37 compute-0 nova_compute[182725]: 2026-01-22 22:27:37.672 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:38 compute-0 nova_compute[182725]: 2026-01-22 22:27:38.802 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:39 compute-0 podman[219692]: 2026-01-22 22:27:39.160893628 +0000 UTC m=+0.089901244 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:27:40 compute-0 nova_compute[182725]: 2026-01-22 22:27:40.734 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:41 compute-0 podman[219715]: 2026-01-22 22:27:41.147865561 +0000 UTC m=+0.070337497 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Jan 22 22:27:41 compute-0 podman[219714]: 2026-01-22 22:27:41.193850359 +0000 UTC m=+0.123713453 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:27:42 compute-0 nova_compute[182725]: 2026-01-22 22:27:42.676 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:43 compute-0 nova_compute[182725]: 2026-01-22 22:27:43.805 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:47.373 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:27:47 compute-0 nova_compute[182725]: 2026-01-22 22:27:47.374 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:47.375 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:27:47 compute-0 nova_compute[182725]: 2026-01-22 22:27:47.630 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120852.6292665, a257831f-dd8b-4af2-afa1-e7cc926e23cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:27:47 compute-0 nova_compute[182725]: 2026-01-22 22:27:47.633 182729 INFO nova.compute.manager [-] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] VM Stopped (Lifecycle Event)
Jan 22 22:27:47 compute-0 nova_compute[182725]: 2026-01-22 22:27:47.661 182729 DEBUG nova.compute.manager [None req-a5b13224-e863-4431-a668-f3b6df01a4ad - - - - - -] [instance: a257831f-dd8b-4af2-afa1-e7cc926e23cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:27:47 compute-0 nova_compute[182725]: 2026-01-22 22:27:47.679 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:48 compute-0 nova_compute[182725]: 2026-01-22 22:27:48.806 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:51 compute-0 podman[219760]: 2026-01-22 22:27:51.139216549 +0000 UTC m=+0.072260756 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:27:51 compute-0 podman[219761]: 2026-01-22 22:27:51.165768383 +0000 UTC m=+0.088008385 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:27:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:27:52.377 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:27:52 compute-0 nova_compute[182725]: 2026-01-22 22:27:52.683 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:53 compute-0 nova_compute[182725]: 2026-01-22 22:27:53.807 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:55 compute-0 podman[219803]: 2026-01-22 22:27:55.136193468 +0000 UTC m=+0.065761521 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:27:57 compute-0 nova_compute[182725]: 2026-01-22 22:27:57.686 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:27:58 compute-0 nova_compute[182725]: 2026-01-22 22:27:58.809 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:02 compute-0 nova_compute[182725]: 2026-01-22 22:28:02.691 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:03 compute-0 nova_compute[182725]: 2026-01-22 22:28:03.811 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:07 compute-0 nova_compute[182725]: 2026-01-22 22:28:07.695 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:08 compute-0 nova_compute[182725]: 2026-01-22 22:28:08.814 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:28:09.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:28:10 compute-0 podman[219828]: 2026-01-22 22:28:10.139747497 +0000 UTC m=+0.073485856 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:28:11 compute-0 nova_compute[182725]: 2026-01-22 22:28:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.081 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.082 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.083 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.083 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:28:12 compute-0 podman[219849]: 2026-01-22 22:28:12.171005975 +0000 UTC m=+0.094792759 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Jan 22 22:28:12 compute-0 podman[219848]: 2026-01-22 22:28:12.214758246 +0000 UTC m=+0.138761085 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.304 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.306 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.37689208984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.306 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.306 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.392 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.393 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.424 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:28:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:12.435 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:12.435 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:12.436 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.443 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.469 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.470 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:12 compute-0 nova_compute[182725]: 2026-01-22 22:28:12.697 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:13 compute-0 nova_compute[182725]: 2026-01-22 22:28:13.465 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:13 compute-0 nova_compute[182725]: 2026-01-22 22:28:13.817 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:14 compute-0 nova_compute[182725]: 2026-01-22 22:28:14.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:14 compute-0 nova_compute[182725]: 2026-01-22 22:28:14.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:28:14 compute-0 nova_compute[182725]: 2026-01-22 22:28:14.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:28:14 compute-0 nova_compute[182725]: 2026-01-22 22:28:14.909 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:28:15 compute-0 nova_compute[182725]: 2026-01-22 22:28:15.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:16 compute-0 nova_compute[182725]: 2026-01-22 22:28:16.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:17 compute-0 ovn_controller[94850]: 2026-01-22T22:28:17Z|00190|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 22:28:17 compute-0 nova_compute[182725]: 2026-01-22 22:28:17.702 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:17 compute-0 nova_compute[182725]: 2026-01-22 22:28:17.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:17 compute-0 nova_compute[182725]: 2026-01-22 22:28:17.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:28:18 compute-0 nova_compute[182725]: 2026-01-22 22:28:18.819 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:19 compute-0 nova_compute[182725]: 2026-01-22 22:28:19.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:21 compute-0 nova_compute[182725]: 2026-01-22 22:28:21.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.066 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.066 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.086 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:28:22 compute-0 podman[219896]: 2026-01-22 22:28:22.135031098 +0000 UTC m=+0.057433170 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:28:22 compute-0 podman[219895]: 2026-01-22 22:28:22.173929646 +0000 UTC m=+0.094457810 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.193 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.194 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.200 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.200 182729 INFO nova.compute.claims [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.321 182729 DEBUG nova.compute.provider_tree [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.341 182729 DEBUG nova.scheduler.client.report [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.366 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.367 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.438 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.440 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.472 182729 INFO nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.486 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.642 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.644 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.644 182729 INFO nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Creating image(s)
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.645 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.645 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.646 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.663 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.706 182729 DEBUG nova.policy [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dfe595ef2d8b4e2fa64dbf2a2c3b64ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.709 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.728 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.728 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.729 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.742 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.811 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:22 compute-0 nova_compute[182725]: 2026-01-22 22:28:22.812 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.357 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk 1073741824" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.358 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.359 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.451 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.453 182729 DEBUG nova.virt.disk.api [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Checking if we can resize image /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.454 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.535 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.537 182729 DEBUG nova.virt.disk.api [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Cannot resize image /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.538 182729 DEBUG nova.objects.instance [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lazy-loading 'migration_context' on Instance uuid 9861fe27-e0f7-43cb-975c-6ff504da2c51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.558 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.558 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Ensure instance console log exists: /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.559 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.560 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.561 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.677 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Successfully created port: c4cbdd98-0f34-4993-839d-1386995755ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:23 compute-0 nova_compute[182725]: 2026-01-22 22:28:23.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.379 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Successfully updated port: c4cbdd98-0f34-4993-839d-1386995755ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.399 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.400 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquired lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.400 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.555 182729 DEBUG nova.compute.manager [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-changed-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.556 182729 DEBUG nova.compute.manager [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Refreshing instance network info cache due to event network-changed-c4cbdd98-0f34-4993-839d-1386995755ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.556 182729 DEBUG oslo_concurrency.lockutils [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:25 compute-0 nova_compute[182725]: 2026-01-22 22:28:25.647 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:28:26 compute-0 podman[219954]: 2026-01-22 22:28:26.121886612 +0000 UTC m=+0.055513901 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.705 182729 DEBUG nova.network.neutron [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.728 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Releasing lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.728 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Instance network_info: |[{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.729 182729 DEBUG oslo_concurrency.lockutils [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.729 182729 DEBUG nova.network.neutron [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Refreshing network info cache for port c4cbdd98-0f34-4993-839d-1386995755ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.732 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Start _get_guest_xml network_info=[{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.739 182729 WARNING nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.745 182729 DEBUG nova.virt.libvirt.host [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.746 182729 DEBUG nova.virt.libvirt.host [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.752 182729 DEBUG nova.virt.libvirt.host [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.752 182729 DEBUG nova.virt.libvirt.host [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.753 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.754 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.754 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.754 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.755 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.755 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.755 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.755 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.756 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.756 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.756 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.756 182729 DEBUG nova.virt.hardware [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.759 182729 DEBUG nova.virt.libvirt.vif [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:22Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.760 182729 DEBUG nova.network.os_vif_util [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.761 182729 DEBUG nova.network.os_vif_util [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.762 182729 DEBUG nova.objects.instance [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lazy-loading 'pci_devices' on Instance uuid 9861fe27-e0f7-43cb-975c-6ff504da2c51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.774 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <uuid>9861fe27-e0f7-43cb-975c-6ff504da2c51</uuid>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <name>instance-00000046</name>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:name>tempest-AttachInterfacesV270Test-server-1567633822</nova:name>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:28:26</nova:creationTime>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:user uuid="dfe595ef2d8b4e2fa64dbf2a2c3b64ba">tempest-AttachInterfacesV270Test-1156061049-project-member</nova:user>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:project uuid="f3faa2e2f55846f28c226341525ab1cd">tempest-AttachInterfacesV270Test-1156061049</nova:project>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         <nova:port uuid="c4cbdd98-0f34-4993-839d-1386995755ce">
Jan 22 22:28:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <system>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="serial">9861fe27-e0f7-43cb-975c-6ff504da2c51</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="uuid">9861fe27-e0f7-43cb-975c-6ff504da2c51</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </system>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <os>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </os>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <features>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </features>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.config"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:d4:6d:75"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <target dev="tapc4cbdd98-0f"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/console.log" append="off"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <video>
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </video>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:28:26 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:28:26 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:28:26 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:28:26 compute-0 nova_compute[182725]: </domain>
Jan 22 22:28:26 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.775 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Preparing to wait for external event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.776 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.776 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.777 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.778 182729 DEBUG nova.virt.libvirt.vif [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:22Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.778 182729 DEBUG nova.network.os_vif_util [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.780 182729 DEBUG nova.network.os_vif_util [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.780 182729 DEBUG os_vif [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.781 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.782 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.782 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.787 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.787 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4cbdd98-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.788 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4cbdd98-0f, col_values=(('external_ids', {'iface-id': 'c4cbdd98-0f34-4993-839d-1386995755ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:6d:75', 'vm-uuid': '9861fe27-e0f7-43cb-975c-6ff504da2c51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.790 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:26 compute-0 NetworkManager[54954]: <info>  [1769120906.7909] manager: (tapc4cbdd98-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.801 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.803 182729 INFO os_vif [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f')
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.861 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.861 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.861 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No VIF found with MAC fa:16:3e:d4:6d:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:28:26 compute-0 nova_compute[182725]: 2026-01-22 22:28:26.862 182729 INFO nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Using config drive
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.513 182729 INFO nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Creating config drive at /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.config
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.518 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gcr66qw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.647 182729 DEBUG oslo_concurrency.processutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gcr66qw" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:27 compute-0 kernel: tapc4cbdd98-0f: entered promiscuous mode
Jan 22 22:28:27 compute-0 NetworkManager[54954]: <info>  [1769120907.7322] manager: (tapc4cbdd98-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 22 22:28:27 compute-0 ovn_controller[94850]: 2026-01-22T22:28:27Z|00191|binding|INFO|Claiming lport c4cbdd98-0f34-4993-839d-1386995755ce for this chassis.
Jan 22 22:28:27 compute-0 ovn_controller[94850]: 2026-01-22T22:28:27Z|00192|binding|INFO|c4cbdd98-0f34-4993-839d-1386995755ce: Claiming fa:16:3e:d4:6d:75 10.100.0.10
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.733 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.737 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:27 compute-0 systemd-udevd[219996]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.780 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:6d:75 10.100.0.10'], port_security=['fa:16:3e:d4:6d:75 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9861fe27-e0f7-43cb-975c-6ff504da2c51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6665a7e0-3e36-4374-9049-2cd4a3bb8819', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c52297c-362e-4500-ae79-b504ff1ae574, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c4cbdd98-0f34-4993-839d-1386995755ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.782 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c4cbdd98-0f34-4993-839d-1386995755ce in datapath 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb bound to our chassis
Jan 22 22:28:27 compute-0 systemd-machined[154006]: New machine qemu-28-instance-00000046.
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.783 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb
Jan 22 22:28:27 compute-0 NetworkManager[54954]: <info>  [1769120907.7879] device (tapc4cbdd98-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:28:27 compute-0 NetworkManager[54954]: <info>  [1769120907.7884] device (tapc4cbdd98-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:27 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000046.
Jan 22 22:28:27 compute-0 ovn_controller[94850]: 2026-01-22T22:28:27Z|00193|binding|INFO|Setting lport c4cbdd98-0f34-4993-839d-1386995755ce ovn-installed in OVS
Jan 22 22:28:27 compute-0 ovn_controller[94850]: 2026-01-22T22:28:27Z|00194|binding|INFO|Setting lport c4cbdd98-0f34-4993-839d-1386995755ce up in Southbound
Jan 22 22:28:27 compute-0 nova_compute[182725]: 2026-01-22 22:28:27.797 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.798 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f45aef0d-6e13-4e94-b88d-962270fc3844]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.799 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2c3fc5f2-81 in ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.802 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2c3fc5f2-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.802 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5232e4-07dc-49a7-a7ca-9a5bf4c09ae5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.804 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[daf1eb5d-af6e-4750-827a-53c1bf8b6284]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.818 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c75ca0-9599-4ffb-9537-7f607bd9b0e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.843 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5ebd09-701b-4a3a-b867-5fcd6176d5dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.883 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f471a9d6-6803-483e-9cd9-722c2e4bf5d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.891 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f42677-2581-4c1b-a443-e6e3e78d5997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 NetworkManager[54954]: <info>  [1769120907.8935] manager: (tap2c3fc5f2-80): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.921 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1ffe28-60bb-4881-bac2-cf2020e6dcc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.925 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3871daf3-9e6a-4d42-8c6e-379cdc95d0f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 NetworkManager[54954]: <info>  [1769120907.9535] device (tap2c3fc5f2-80): carrier: link connected
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.963 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aca106ed-337a-4b2f-941f-040a54f9cfe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:27.990 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbfb5ce-ff3c-4c30-80b1-f5e98748cedd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c3fc5f2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:72:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454555, 'reachable_time': 28360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220036, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.013 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[061f171f-02ed-4e2d-9858-dd2714e09545]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:72b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454555, 'tstamp': 454555}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220037, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.043 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82df60f8-134f-4ce3-a1cd-59b97fbd3527]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c3fc5f2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:72:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454555, 'reachable_time': 28360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220039, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.060 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120908.0589988, 9861fe27-e0f7-43cb-975c-6ff504da2c51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.060 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] VM Started (Lifecycle Event)
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.087 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.090 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[819d0028-f95a-4020-9f00-063e9b7645e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.093 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120908.0597293, 9861fe27-e0f7-43cb-975c-6ff504da2c51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.094 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] VM Paused (Lifecycle Event)
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.114 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.121 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.151 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.177 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[99961909-4794-4901-b4b3-24d46c79f319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.179 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c3fc5f2-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.179 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.180 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c3fc5f2-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.183 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:28 compute-0 NetworkManager[54954]: <info>  [1769120908.1839] manager: (tap2c3fc5f2-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 22 22:28:28 compute-0 kernel: tap2c3fc5f2-80: entered promiscuous mode
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.189 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c3fc5f2-80, col_values=(('external_ids', {'iface-id': '77819b55-5285-4abb-9e9a-657dee5f7099'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:28 compute-0 ovn_controller[94850]: 2026-01-22T22:28:28Z|00195|binding|INFO|Releasing lport 77819b55-5285-4abb-9e9a-657dee5f7099 from this chassis (sb_readonly=0)
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.193 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.195 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.195 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fb975b84-74d8-4c68-a915-9a82c34a893b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.196 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb.pid.haproxy
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:28:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:28.197 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'env', 'PROCESS_TAG=haproxy-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.205 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:28 compute-0 podman[220071]: 2026-01-22 22:28:28.591162262 +0000 UTC m=+0.064106929 container create 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:28:28 compute-0 podman[220071]: 2026-01-22 22:28:28.553528436 +0000 UTC m=+0.026473153 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:28:28 compute-0 systemd[1]: Started libpod-conmon-69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8.scope.
Jan 22 22:28:28 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.691 182729 DEBUG nova.network.neutron [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updated VIF entry in instance network info cache for port c4cbdd98-0f34-4993-839d-1386995755ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.692 182729 DEBUG nova.network.neutron [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4873aa2be51b273df6fc42df63695583856a67ba0b820790a8b55c31576683d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.711 182729 DEBUG oslo_concurrency.lockutils [req-1d1f5d1a-f25b-475f-8c67-547efdcbd38a req-fbf8f3d2-dc3c-4e4c-af30-b1967128810b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:28 compute-0 podman[220071]: 2026-01-22 22:28:28.715296994 +0000 UTC m=+0.188241671 container init 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:28:28 compute-0 podman[220071]: 2026-01-22 22:28:28.721256616 +0000 UTC m=+0.194201273 container start 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:28:28 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [NOTICE]   (220091) : New worker (220093) forked
Jan 22 22:28:28 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [NOTICE]   (220091) : Loading success.
Jan 22 22:28:28 compute-0 nova_compute[182725]: 2026-01-22 22:28:28.823 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:31 compute-0 nova_compute[182725]: 2026-01-22 22:28:31.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.826 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.956 182729 DEBUG nova.compute.manager [req-9dfab473-8853-46fa-b3f7-8d8bcfcced86 req-6b95221a-6c28-452f-8132-aee21d7ed58e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.957 182729 DEBUG oslo_concurrency.lockutils [req-9dfab473-8853-46fa-b3f7-8d8bcfcced86 req-6b95221a-6c28-452f-8132-aee21d7ed58e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.958 182729 DEBUG oslo_concurrency.lockutils [req-9dfab473-8853-46fa-b3f7-8d8bcfcced86 req-6b95221a-6c28-452f-8132-aee21d7ed58e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.958 182729 DEBUG oslo_concurrency.lockutils [req-9dfab473-8853-46fa-b3f7-8d8bcfcced86 req-6b95221a-6c28-452f-8132-aee21d7ed58e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.959 182729 DEBUG nova.compute.manager [req-9dfab473-8853-46fa-b3f7-8d8bcfcced86 req-6b95221a-6c28-452f-8132-aee21d7ed58e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Processing event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.960 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.964 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120913.964253, 9861fe27-e0f7-43cb-975c-6ff504da2c51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.965 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] VM Resumed (Lifecycle Event)
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.968 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.971 182729 INFO nova.virt.libvirt.driver [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Instance spawned successfully.
Jan 22 22:28:33 compute-0 nova_compute[182725]: 2026-01-22 22:28:33.972 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.001 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.007 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.012 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.012 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.013 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.014 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.014 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.015 182729 DEBUG nova.virt.libvirt.driver [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.048 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.102 182729 INFO nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Took 11.46 seconds to spawn the instance on the hypervisor.
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.102 182729 DEBUG nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.208 182729 INFO nova.compute.manager [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Took 12.05 seconds to build instance.
Jan 22 22:28:34 compute-0 nova_compute[182725]: 2026-01-22 22:28:34.241 182729 DEBUG oslo_concurrency.lockutils [None req-ef6e896b-ce38-4d2d-b16a-e8e64e778ddb dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.205 182729 DEBUG nova.compute.manager [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.205 182729 DEBUG oslo_concurrency.lockutils [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.205 182729 DEBUG oslo_concurrency.lockutils [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.206 182729 DEBUG oslo_concurrency.lockutils [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.206 182729 DEBUG nova.compute.manager [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.206 182729 WARNING nova.compute.manager [req-a5e7b94c-1380-481e-919d-b9f2b07e25e0 req-f6cd174b-1c13-4de7-8297-071fc5acfdf9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received unexpected event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce for instance with vm_state active and task_state None.
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.747 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "interface-9861fe27-e0f7-43cb-975c-6ff504da2c51-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.748 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "interface-9861fe27-e0f7-43cb-975c-6ff504da2c51-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.749 182729 DEBUG nova.objects.instance [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lazy-loading 'flavor' on Instance uuid 9861fe27-e0f7-43cb-975c-6ff504da2c51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.776 182729 DEBUG nova.objects.instance [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lazy-loading 'pci_requests' on Instance uuid 9861fe27-e0f7-43cb-975c-6ff504da2c51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.788 182729 DEBUG nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:28:36 compute-0 nova_compute[182725]: 2026-01-22 22:28:36.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:37 compute-0 nova_compute[182725]: 2026-01-22 22:28:37.227 182729 DEBUG nova.policy [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dfe595ef2d8b4e2fa64dbf2a2c3b64ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:28:38 compute-0 nova_compute[182725]: 2026-01-22 22:28:38.240 182729 DEBUG nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Successfully created port: 46a27ac7-81fb-4245-8fc9-947332632fd6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:28:38 compute-0 nova_compute[182725]: 2026-01-22 22:28:38.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.243 182729 DEBUG nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Successfully updated port: 46a27ac7-81fb-4245-8fc9-947332632fd6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.257 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.257 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquired lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.258 182729 DEBUG nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.428 182729 WARNING nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb already exists in list: networks containing: ['2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb']. ignoring it
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.778 182729 DEBUG nova.compute.manager [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-changed-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.778 182729 DEBUG nova.compute.manager [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Refreshing instance network info cache due to event network-changed-46a27ac7-81fb-4245-8fc9-947332632fd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:28:39 compute-0 nova_compute[182725]: 2026-01-22 22:28:39.778 182729 DEBUG oslo_concurrency.lockutils [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.959 182729 DEBUG nova.network.neutron [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.976 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Releasing lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.976 182729 DEBUG oslo_concurrency.lockutils [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.977 182729 DEBUG nova.network.neutron [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Refreshing network info cache for port 46a27ac7-81fb-4245-8fc9-947332632fd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.980 182729 DEBUG nova.virt.libvirt.vif [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:34Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.980 182729 DEBUG nova.network.os_vif_util [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.981 182729 DEBUG nova.network.os_vif_util [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.982 182729 DEBUG os_vif [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.982 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.983 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.983 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.989 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.989 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46a27ac7-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.990 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46a27ac7-81, col_values=(('external_ids', {'iface-id': '46a27ac7-81fb-4245-8fc9-947332632fd6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:17:98', 'vm-uuid': '9861fe27-e0f7-43cb-975c-6ff504da2c51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.991 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:40 compute-0 nova_compute[182725]: 2026-01-22 22:28:40.993 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:28:40 compute-0 NetworkManager[54954]: <info>  [1769120920.9956] manager: (tap46a27ac7-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.006 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.008 182729 INFO os_vif [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81')
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.009 182729 DEBUG nova.virt.libvirt.vif [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:34Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.009 182729 DEBUG nova.network.os_vif_util [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.010 182729 DEBUG nova.network.os_vif_util [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.013 182729 DEBUG nova.virt.libvirt.guest [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] attach device xml: <interface type="ethernet">
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:ea:17:98"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <target dev="tap46a27ac7-81"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]: </interface>
Jan 22 22:28:41 compute-0 nova_compute[182725]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 22:28:41 compute-0 kernel: tap46a27ac7-81: entered promiscuous mode
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.028 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:41 compute-0 ovn_controller[94850]: 2026-01-22T22:28:41Z|00196|binding|INFO|Claiming lport 46a27ac7-81fb-4245-8fc9-947332632fd6 for this chassis.
Jan 22 22:28:41 compute-0 ovn_controller[94850]: 2026-01-22T22:28:41Z|00197|binding|INFO|46a27ac7-81fb-4245-8fc9-947332632fd6: Claiming fa:16:3e:ea:17:98 10.100.0.5
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.040 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:17:98 10.100.0.5'], port_security=['fa:16:3e:ea:17:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9861fe27-e0f7-43cb-975c-6ff504da2c51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6665a7e0-3e36-4374-9049-2cd4a3bb8819', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c52297c-362e-4500-ae79-b504ff1ae574, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=46a27ac7-81fb-4245-8fc9-947332632fd6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.042 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 46a27ac7-81fb-4245-8fc9-947332632fd6 in datapath 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb bound to our chassis
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.043 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb
Jan 22 22:28:41 compute-0 NetworkManager[54954]: <info>  [1769120921.0445] manager: (tap46a27ac7-81): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Jan 22 22:28:41 compute-0 ovn_controller[94850]: 2026-01-22T22:28:41Z|00198|binding|INFO|Setting lport 46a27ac7-81fb-4245-8fc9-947332632fd6 ovn-installed in OVS
Jan 22 22:28:41 compute-0 ovn_controller[94850]: 2026-01-22T22:28:41Z|00199|binding|INFO|Setting lport 46a27ac7-81fb-4245-8fc9-947332632fd6 up in Southbound
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.049 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.068 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0755c3a0-990c-4c70-adff-0d3fac7947e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 systemd-udevd[220120]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.104 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f4f3fa-f64d-41d3-a344-c77ec27f575b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.107 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd0fdfd-1644-47a1-a5a1-9be7467d0acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 NetworkManager[54954]: <info>  [1769120921.1184] device (tap46a27ac7-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:28:41 compute-0 NetworkManager[54954]: <info>  [1769120921.1193] device (tap46a27ac7-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.135 182729 DEBUG nova.virt.libvirt.driver [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.136 182729 DEBUG nova.virt.libvirt.driver [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.136 182729 DEBUG nova.virt.libvirt.driver [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No VIF found with MAC fa:16:3e:d4:6d:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.136 182729 DEBUG nova.virt.libvirt.driver [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] No VIF found with MAC fa:16:3e:ea:17:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:28:41 compute-0 podman[220107]: 2026-01-22 22:28:41.139873857 +0000 UTC m=+0.071351413 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.140 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[291e279a-dbdb-449e-b031-ac1bfce08452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.161 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[30b3d773-bc58-489b-a1a7-2f1eefc9a033]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c3fc5f2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:72:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454555, 'reachable_time': 28360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220135, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.165 182729 DEBUG nova.virt.libvirt.guest [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:name>tempest-AttachInterfacesV270Test-server-1567633822</nova:name>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:28:41</nova:creationTime>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:user uuid="dfe595ef2d8b4e2fa64dbf2a2c3b64ba">tempest-AttachInterfacesV270Test-1156061049-project-member</nova:user>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:project uuid="f3faa2e2f55846f28c226341525ab1cd">tempest-AttachInterfacesV270Test-1156061049</nova:project>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:port uuid="c4cbdd98-0f34-4993-839d-1386995755ce">
Jan 22 22:28:41 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     <nova:port uuid="46a27ac7-81fb-4245-8fc9-947332632fd6">
Jan 22 22:28:41 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:28:41 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:28:41 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:28:41 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:28:41 compute-0 nova_compute[182725]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.179 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7129278a-1043-43f4-b911-b029fe2540be]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2c3fc5f2-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454573, 'tstamp': 454573}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220136, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2c3fc5f2-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454577, 'tstamp': 454577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220136, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.181 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c3fc5f2-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.182 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.183 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c3fc5f2-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c3fc5f2-80, col_values=(('external_ids', {'iface-id': '77819b55-5285-4abb-9e9a-657dee5f7099'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:41.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.192 182729 DEBUG oslo_concurrency.lockutils [None req-fe552927-70b3-4f67-b358-b62545df7d03 dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "interface-9861fe27-e0f7-43cb-975c-6ff504da2c51-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.433 182729 DEBUG nova.compute.manager [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.433 182729 DEBUG oslo_concurrency.lockutils [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.434 182729 DEBUG oslo_concurrency.lockutils [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.434 182729 DEBUG oslo_concurrency.lockutils [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.434 182729 DEBUG nova.compute.manager [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:41 compute-0 nova_compute[182725]: 2026-01-22 22:28:41.435 182729 WARNING nova.compute.manager [req-1d4d5464-31a9-463c-9c40-13e48f9f5d94 req-74aec068-10dd-4c6b-b888-f383e2cee8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received unexpected event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 for instance with vm_state active and task_state None.
Jan 22 22:28:42 compute-0 nova_compute[182725]: 2026-01-22 22:28:42.121 182729 DEBUG nova.network.neutron [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updated VIF entry in instance network info cache for port 46a27ac7-81fb-4245-8fc9-947332632fd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:28:42 compute-0 nova_compute[182725]: 2026-01-22 22:28:42.122 182729 DEBUG nova.network.neutron [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [{"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:42 compute-0 nova_compute[182725]: 2026-01-22 22:28:42.139 182729 DEBUG oslo_concurrency.lockutils [req-a2181e23-13cf-4237-898b-290d66a4cd74 req-69fafe7d-dc15-43ff-b9fb-9e6228a94cd9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9861fe27-e0f7-43cb-975c-6ff504da2c51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.092 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.093 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.093 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.093 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.094 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.105 182729 INFO nova.compute.manager [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Terminating instance
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.118 182729 DEBUG nova.compute.manager [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:28:43 compute-0 kernel: tapc4cbdd98-0f (unregistering): left promiscuous mode
Jan 22 22:28:43 compute-0 NetworkManager[54954]: <info>  [1769120923.1507] device (tapc4cbdd98-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00200|binding|INFO|Releasing lport c4cbdd98-0f34-4993-839d-1386995755ce from this chassis (sb_readonly=0)
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00201|binding|INFO|Setting lport c4cbdd98-0f34-4993-839d-1386995755ce down in Southbound
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.152 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00202|binding|INFO|Removing iface tapc4cbdd98-0f ovn-installed in OVS
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.154 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.160 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:6d:75 10.100.0.10'], port_security=['fa:16:3e:d4:6d:75 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9861fe27-e0f7-43cb-975c-6ff504da2c51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6665a7e0-3e36-4374-9049-2cd4a3bb8819', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c52297c-362e-4500-ae79-b504ff1ae574, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c4cbdd98-0f34-4993-839d-1386995755ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.161 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c4cbdd98-0f34-4993-839d-1386995755ce in datapath 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb unbound from our chassis
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.162 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.178 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 kernel: tap46a27ac7-81 (unregistering): left promiscuous mode
Jan 22 22:28:43 compute-0 podman[220138]: 2026-01-22 22:28:43.185582841 +0000 UTC m=+0.094991304 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.186 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[26e9c97b-612f-401c-8378-be7bd885f96c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 NetworkManager[54954]: <info>  [1769120923.1892] device (tap46a27ac7-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00203|binding|INFO|Releasing lport 46a27ac7-81fb-4245-8fc9-947332632fd6 from this chassis (sb_readonly=0)
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00204|binding|INFO|Setting lport 46a27ac7-81fb-4245-8fc9-947332632fd6 down in Southbound
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.199 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_controller[94850]: 2026-01-22T22:28:43Z|00205|binding|INFO|Removing iface tap46a27ac7-81 ovn-installed in OVS
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.202 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.208 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:17:98 10.100.0.5'], port_security=['fa:16:3e:ea:17:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9861fe27-e0f7-43cb-975c-6ff504da2c51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f3faa2e2f55846f28c226341525ab1cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6665a7e0-3e36-4374-9049-2cd4a3bb8819', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c52297c-362e-4500-ae79-b504ff1ae574, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=46a27ac7-81fb-4245-8fc9-947332632fd6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:43 compute-0 podman[220137]: 2026-01-22 22:28:43.224777036 +0000 UTC m=+0.151214291 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.226 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.229 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5995d3-75dd-46fc-8b67-38d3f10f1d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.232 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c1652bdb-4ed3-4286-81ae-2849fcde449f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 22 22:28:43 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000046.scope: Consumed 9.550s CPU time.
Jan 22 22:28:43 compute-0 systemd-machined[154006]: Machine qemu-28-instance-00000046 terminated.
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.276 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae15d6b-ff4e-417c-84f3-93386194dc78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.296 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b13dd276-d8c6-49c8-80e7-c0c7a213a14b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c3fc5f2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:72:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454555, 'reachable_time': 28360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220197, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.319 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c1919d72-48c3-4bbe-af9b-2b56daa62703]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2c3fc5f2-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454573, 'tstamp': 454573}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220198, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2c3fc5f2-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454577, 'tstamp': 454577}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220198, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.321 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c3fc5f2-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.336 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.337 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c3fc5f2-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.337 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.338 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c3fc5f2-80, col_values=(('external_ids', {'iface-id': '77819b55-5285-4abb-9e9a-657dee5f7099'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.338 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.339 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 46a27ac7-81fb-4245-8fc9-947332632fd6 in datapath 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb unbound from our chassis
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.341 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.342 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd1fec3-e9d8-48b3-a433-13a52eb145ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.343 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb namespace which is not needed anymore
Jan 22 22:28:43 compute-0 NetworkManager[54954]: <info>  [1769120923.3454] manager: (tapc4cbdd98-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Jan 22 22:28:43 compute-0 NetworkManager[54954]: <info>  [1769120923.3617] manager: (tap46a27ac7-81): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.402 182729 DEBUG nova.compute.manager [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-unplugged-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.403 182729 DEBUG oslo_concurrency.lockutils [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.403 182729 DEBUG oslo_concurrency.lockutils [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.404 182729 DEBUG oslo_concurrency.lockutils [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.404 182729 DEBUG nova.compute.manager [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-unplugged-c4cbdd98-0f34-4993-839d-1386995755ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.405 182729 DEBUG nova.compute.manager [req-8d3dd23a-4199-4bf5-9967-83c8608895e2 req-ef155718-b844-4a73-961b-bebec6eefb37 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-unplugged-c4cbdd98-0f34-4993-839d-1386995755ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.419 182729 INFO nova.virt.libvirt.driver [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Instance destroyed successfully.
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.420 182729 DEBUG nova.objects.instance [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lazy-loading 'resources' on Instance uuid 9861fe27-e0f7-43cb-975c-6ff504da2c51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.436 182729 DEBUG nova.virt.libvirt.vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:34Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.437 182729 DEBUG nova.network.os_vif_util [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "c4cbdd98-0f34-4993-839d-1386995755ce", "address": "fa:16:3e:d4:6d:75", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4cbdd98-0f", "ovs_interfaceid": "c4cbdd98-0f34-4993-839d-1386995755ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.438 182729 DEBUG nova.network.os_vif_util [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.438 182729 DEBUG os_vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.440 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.441 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4cbdd98-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.449 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.452 182729 INFO os_vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:6d:75,bridge_name='br-int',has_traffic_filtering=True,id=c4cbdd98-0f34-4993-839d-1386995755ce,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4cbdd98-0f')
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.453 182729 DEBUG nova.virt.libvirt.vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1567633822',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1567633822',id=70,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f3faa2e2f55846f28c226341525ab1cd',ramdisk_id='',reservation_id='r-iwl13bk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1156061049',owner_user_name='tempest-AttachInterfacesV270Test-1156061049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:34Z,user_data=None,user_id='dfe595ef2d8b4e2fa64dbf2a2c3b64ba',uuid=9861fe27-e0f7-43cb-975c-6ff504da2c51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.453 182729 DEBUG nova.network.os_vif_util [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converting VIF {"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.454 182729 DEBUG nova.network.os_vif_util [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.454 182729 DEBUG os_vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.456 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap46a27ac7-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.459 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.462 182729 INFO os_vif [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:17:98,bridge_name='br-int',has_traffic_filtering=True,id=46a27ac7-81fb-4245-8fc9-947332632fd6,network=Network(2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46a27ac7-81')
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.463 182729 INFO nova.virt.libvirt.driver [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Deleting instance files /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51_del
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.464 182729 INFO nova.virt.libvirt.driver [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Deletion of /var/lib/nova/instances/9861fe27-e0f7-43cb-975c-6ff504da2c51_del complete
Jan 22 22:28:43 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [NOTICE]   (220091) : haproxy version is 2.8.14-c23fe91
Jan 22 22:28:43 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [NOTICE]   (220091) : path to executable is /usr/sbin/haproxy
Jan 22 22:28:43 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [WARNING]  (220091) : Exiting Master process...
Jan 22 22:28:43 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [ALERT]    (220091) : Current worker (220093) exited with code 143 (Terminated)
Jan 22 22:28:43 compute-0 neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb[220087]: [WARNING]  (220091) : All workers exited. Exiting... (0)
Jan 22 22:28:43 compute-0 systemd[1]: libpod-69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8.scope: Deactivated successfully.
Jan 22 22:28:43 compute-0 podman[220248]: 2026-01-22 22:28:43.525268238 +0000 UTC m=+0.050787611 container died 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.561 182729 DEBUG nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.562 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.562 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.563 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.563 182729 DEBUG nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.564 182729 WARNING nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received unexpected event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 for instance with vm_state active and task_state deleting.
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.564 182729 DEBUG nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-unplugged-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8-userdata-shm.mount: Deactivated successfully.
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.564 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.564 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.564 182729 DEBUG oslo_concurrency.lockutils [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.565 182729 DEBUG nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-unplugged-46a27ac7-81fb-4245-8fc9-947332632fd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.565 182729 DEBUG nova.compute.manager [req-5b8d3a0d-6383-44e5-9ef5-88f3142a7060 req-b44b8793-0e6e-4935-9511-5e2b5f0e4f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-unplugged-46a27ac7-81fb-4245-8fc9-947332632fd6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.566 182729 INFO nova.compute.manager [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 22:28:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-4873aa2be51b273df6fc42df63695583856a67ba0b820790a8b55c31576683d1-merged.mount: Deactivated successfully.
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.566 182729 DEBUG oslo.service.loopingcall [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.566 182729 DEBUG nova.compute.manager [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.567 182729 DEBUG nova.network.neutron [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:28:43 compute-0 podman[220248]: 2026-01-22 22:28:43.574737084 +0000 UTC m=+0.100256457 container cleanup 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 22:28:43 compute-0 systemd[1]: libpod-conmon-69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8.scope: Deactivated successfully.
Jan 22 22:28:43 compute-0 podman[220279]: 2026-01-22 22:28:43.651316599 +0000 UTC m=+0.051131019 container remove 69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.658 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[53d73fca-37c5-4ec9-84ce-5655c4668bf6]: (4, ('Thu Jan 22 10:28:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb (69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8)\n69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8\nThu Jan 22 10:28:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb (69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8)\n69e3cb7ce0b1a9d889aa626e8fc87438ed15a481e5902037c59302c40f544ce8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.660 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d9441283-2997-4327-88c4-1a830331445b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.661 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c3fc5f2-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.663 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 kernel: tap2c3fc5f2-80: left promiscuous mode
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.665 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.669 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1d061f-6987-4077-8b69-79d0f561af50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.679 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.687 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9782b405-ccfc-464d-a5fa-a660c2d5a5d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.688 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6a7afbac-4cdb-46d0-a7de-3a009a41c9c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.703 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[065afa7b-8d4c-4a89-886f-52f622edcae3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454547, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220294, 'error': None, 'target': 'ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d2c3fc5f2\x2d81c9\x2d4d5f\x2d8905\x2df868b3a7efeb.mount: Deactivated successfully.
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.706 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:28:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:43.706 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[43568500-6524-4c5b-84c7-faf20e45fa14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:43 compute-0 nova_compute[182725]: 2026-01-22 22:28:43.831 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:44 compute-0 nova_compute[182725]: 2026-01-22 22:28:44.711 182729 DEBUG nova.compute.manager [req-94bb6477-d7fb-4891-8008-23a56f5c3848 req-ea315bde-4f15-4049-afa2-74749d8b464a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-deleted-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:44 compute-0 nova_compute[182725]: 2026-01-22 22:28:44.711 182729 INFO nova.compute.manager [req-94bb6477-d7fb-4891-8008-23a56f5c3848 req-ea315bde-4f15-4049-afa2-74749d8b464a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Neutron deleted interface c4cbdd98-0f34-4993-839d-1386995755ce; detaching it from the instance and deleting it from the info cache
Jan 22 22:28:44 compute-0 nova_compute[182725]: 2026-01-22 22:28:44.712 182729 DEBUG nova.network.neutron [req-94bb6477-d7fb-4891-8008-23a56f5c3848 req-ea315bde-4f15-4049-afa2-74749d8b464a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [{"id": "46a27ac7-81fb-4245-8fc9-947332632fd6", "address": "fa:16:3e:ea:17:98", "network": {"id": "2c3fc5f2-81c9-4d5f-8905-f868b3a7efeb", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1756875792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f3faa2e2f55846f28c226341525ab1cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46a27ac7-81", "ovs_interfaceid": "46a27ac7-81fb-4245-8fc9-947332632fd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:44 compute-0 nova_compute[182725]: 2026-01-22 22:28:44.809 182729 DEBUG nova.compute.manager [req-94bb6477-d7fb-4891-8008-23a56f5c3848 req-ea315bde-4f15-4049-afa2-74749d8b464a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Detach interface failed, port_id=c4cbdd98-0f34-4993-839d-1386995755ce, reason: Instance 9861fe27-e0f7-43cb-975c-6ff504da2c51 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:28:44 compute-0 nova_compute[182725]: 2026-01-22 22:28:44.883 182729 DEBUG nova.network.neutron [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.171 182729 INFO nova.compute.manager [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Took 1.60 seconds to deallocate network for instance.
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.245 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.245 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.316 182729 DEBUG nova.compute.provider_tree [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.330 182729 DEBUG nova.scheduler.client.report [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.356 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.382 182729 INFO nova.scheduler.client.report [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Deleted allocations for instance 9861fe27-e0f7-43cb-975c-6ff504da2c51
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.480 182729 DEBUG oslo_concurrency.lockutils [None req-e7a34efe-f342-4bba-a970-0d1e8711b38e dfe595ef2d8b4e2fa64dbf2a2c3b64ba f3faa2e2f55846f28c226341525ab1cd - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.511 182729 DEBUG nova.compute.manager [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.511 182729 DEBUG oslo_concurrency.lockutils [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.512 182729 DEBUG oslo_concurrency.lockutils [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.512 182729 DEBUG oslo_concurrency.lockutils [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.512 182729 DEBUG nova.compute.manager [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.512 182729 WARNING nova.compute.manager [req-3cbac2b7-b997-48b3-b7ab-46a5a1e1d258 req-18a41542-8947-4326-a6b3-59972afe8c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received unexpected event network-vif-plugged-c4cbdd98-0f34-4993-839d-1386995755ce for instance with vm_state deleted and task_state None.
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.641 182729 DEBUG nova.compute.manager [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.642 182729 DEBUG oslo_concurrency.lockutils [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.642 182729 DEBUG oslo_concurrency.lockutils [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.642 182729 DEBUG oslo_concurrency.lockutils [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9861fe27-e0f7-43cb-975c-6ff504da2c51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.642 182729 DEBUG nova.compute.manager [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] No waiting events found dispatching network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:28:45 compute-0 nova_compute[182725]: 2026-01-22 22:28:45.642 182729 WARNING nova.compute.manager [req-62004c95-e795-49da-b012-d5b06b59e0e0 req-edcdc370-3a27-4641-a8c5-c45fbdab509f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received unexpected event network-vif-plugged-46a27ac7-81fb-4245-8fc9-947332632fd6 for instance with vm_state deleted and task_state None.
Jan 22 22:28:46 compute-0 nova_compute[182725]: 2026-01-22 22:28:46.912 182729 DEBUG nova.compute.manager [req-a34595c7-3e3a-42c6-90d2-6a6899b4ca5d req-082f7501-29ba-4a3f-8c10-24b61b286db6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Received event network-vif-deleted-46a27ac7-81fb-4245-8fc9-947332632fd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:47 compute-0 nova_compute[182725]: 2026-01-22 22:28:47.526 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:47.526 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:47.527 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:28:48 compute-0 nova_compute[182725]: 2026-01-22 22:28:48.457 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:48 compute-0 nova_compute[182725]: 2026-01-22 22:28:48.833 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:51 compute-0 nova_compute[182725]: 2026-01-22 22:28:51.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.685 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.686 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.704 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.827 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.828 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.838 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.839 182729 INFO nova.compute.claims [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.978 182729 DEBUG nova.compute.provider_tree [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:28:52 compute-0 nova_compute[182725]: 2026-01-22 22:28:52.998 182729 DEBUG nova.scheduler.client.report [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.025 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.026 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.093 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.094 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.114 182729 INFO nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.137 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:28:53 compute-0 podman[220297]: 2026-01-22 22:28:53.151563664 +0000 UTC m=+0.070311246 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:28:53 compute-0 podman[220296]: 2026-01-22 22:28:53.173243515 +0000 UTC m=+0.094372658 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.244 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.245 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.245 182729 INFO nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Creating image(s)
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.246 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.246 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.247 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.259 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.349 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.351 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.352 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.364 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.437 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.438 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.463 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.568 182729 DEBUG nova.policy [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.812 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk 1073741824" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.813 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.814 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.843 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.908 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.909 182729 DEBUG nova.virt.disk.api [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Checking if we can resize image /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:28:53 compute-0 nova_compute[182725]: 2026-01-22 22:28:53.909 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.012 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.013 182729 DEBUG nova.virt.disk.api [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Cannot resize image /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.014 182729 DEBUG nova.objects.instance [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'migration_context' on Instance uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.032 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.033 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Ensure instance console log exists: /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.033 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.034 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.034 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:54 compute-0 nova_compute[182725]: 2026-01-22 22:28:54.491 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Successfully created port: 44997b83-4510-4cb4-9923-c9f1eb78e769 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:28:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:55.529 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.119 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Successfully updated port: 44997b83-4510-4cb4-9923-c9f1eb78e769 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.137 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.137 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquired lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.138 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.246 182729 DEBUG nova.compute.manager [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-changed-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.247 182729 DEBUG nova.compute.manager [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Refreshing instance network info cache due to event network-changed-44997b83-4510-4cb4-9923-c9f1eb78e769. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.248 182729 DEBUG oslo_concurrency.lockutils [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:28:56 compute-0 nova_compute[182725]: 2026-01-22 22:28:56.323 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:28:57 compute-0 podman[220352]: 2026-01-22 22:28:57.144581804 +0000 UTC m=+0.073278962 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.512 182729 DEBUG nova.network.neutron [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.532 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Releasing lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.532 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Instance network_info: |[{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.533 182729 DEBUG oslo_concurrency.lockutils [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.533 182729 DEBUG nova.network.neutron [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Refreshing network info cache for port 44997b83-4510-4cb4-9923-c9f1eb78e769 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.537 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Start _get_guest_xml network_info=[{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.543 182729 WARNING nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.552 182729 DEBUG nova.virt.libvirt.host [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.552 182729 DEBUG nova.virt.libvirt.host [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.557 182729 DEBUG nova.virt.libvirt.host [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.558 182729 DEBUG nova.virt.libvirt.host [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.559 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.560 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.560 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.561 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.561 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.561 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.561 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.562 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.562 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.562 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.563 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.563 182729 DEBUG nova.virt.hardware [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.569 182729 DEBUG nova.virt.libvirt.vif [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-75282839',display_name='tempest-ServerActionsTestOtherA-server-75282839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-75282839',id=72,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkpc4JM5MIuxSEDA6kfPzyskWsK8tI2Fh/Lqyh17yIJ2pJxhfifIULVNg5h9fSbuGmGwCrb5kJWHYC3ZvkDDXwknQcbFIVzKg+3pyYtRS9H4Udhz3FXLW262IKMi1lDZQ==',key_name='tempest-keypair-191312094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-204hr0tv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerActionsTestOtherA-658780637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=fcd5dfc9-aa45-42d6-96d8-739f7eb5504a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.570 182729 DEBUG nova.network.os_vif_util [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.571 182729 DEBUG nova.network.os_vif_util [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.573 182729 DEBUG nova.objects.instance [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'pci_devices' on Instance uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.591 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <uuid>fcd5dfc9-aa45-42d6-96d8-739f7eb5504a</uuid>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <name>instance-00000048</name>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestOtherA-server-75282839</nova:name>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:28:57</nova:creationTime>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:user uuid="fc3e5f9d1ee84e48a089c2636d28a7b0">tempest-ServerActionsTestOtherA-658780637-project-member</nova:user>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:project uuid="825c15e60ddd4efeb69accacdb4b129b">tempest-ServerActionsTestOtherA-658780637</nova:project>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         <nova:port uuid="44997b83-4510-4cb4-9923-c9f1eb78e769">
Jan 22 22:28:57 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <system>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="serial">fcd5dfc9-aa45-42d6-96d8-739f7eb5504a</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="uuid">fcd5dfc9-aa45-42d6-96d8-739f7eb5504a</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </system>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <os>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </os>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <features>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </features>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.config"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:e9:7f:c5"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <target dev="tap44997b83-45"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/console.log" append="off"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <video>
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </video>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:28:57 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:28:57 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:28:57 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:28:57 compute-0 nova_compute[182725]: </domain>
Jan 22 22:28:57 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.593 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Preparing to wait for external event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.593 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.593 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.594 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.594 182729 DEBUG nova.virt.libvirt.vif [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-75282839',display_name='tempest-ServerActionsTestOtherA-server-75282839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-75282839',id=72,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkpc4JM5MIuxSEDA6kfPzyskWsK8tI2Fh/Lqyh17yIJ2pJxhfifIULVNg5h9fSbuGmGwCrb5kJWHYC3ZvkDDXwknQcbFIVzKg+3pyYtRS9H4Udhz3FXLW262IKMi1lDZQ==',key_name='tempest-keypair-191312094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-204hr0tv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerActionsTestOtherA-658780637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=fcd5dfc9-aa45-42d6-96d8-739f7eb5504a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.595 182729 DEBUG nova.network.os_vif_util [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.596 182729 DEBUG nova.network.os_vif_util [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.596 182729 DEBUG os_vif [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.599 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.599 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.601 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.607 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.607 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44997b83-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.608 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44997b83-45, col_values=(('external_ids', {'iface-id': '44997b83-4510-4cb4-9923-c9f1eb78e769', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:7f:c5', 'vm-uuid': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.612 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:28:57 compute-0 NetworkManager[54954]: <info>  [1769120937.6132] manager: (tap44997b83-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.618 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.619 182729 INFO os_vif [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45')
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.921 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.922 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.923 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No VIF found with MAC fa:16:3e:e9:7f:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:28:57 compute-0 nova_compute[182725]: 2026-01-22 22:28:57.923 182729 INFO nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Using config drive
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.370 182729 INFO nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Creating config drive at /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.config
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.384 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqmzrrptp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.418 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120923.4168832, 9861fe27-e0f7-43cb-975c-6ff504da2c51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.419 182729 INFO nova.compute.manager [-] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] VM Stopped (Lifecycle Event)
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.439 182729 DEBUG nova.compute.manager [None req-65e668f8-f1d8-4324-9bc9-3f0cf4681b46 - - - - - -] [instance: 9861fe27-e0f7-43cb-975c-6ff504da2c51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.518 182729 DEBUG oslo_concurrency.processutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqmzrrptp" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:28:58 compute-0 kernel: tap44997b83-45: entered promiscuous mode
Jan 22 22:28:58 compute-0 NetworkManager[54954]: <info>  [1769120938.6072] manager: (tap44997b83-45): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.608 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:58 compute-0 ovn_controller[94850]: 2026-01-22T22:28:58Z|00206|binding|INFO|Claiming lport 44997b83-4510-4cb4-9923-c9f1eb78e769 for this chassis.
Jan 22 22:28:58 compute-0 ovn_controller[94850]: 2026-01-22T22:28:58Z|00207|binding|INFO|44997b83-4510-4cb4-9923-c9f1eb78e769: Claiming fa:16:3e:e9:7f:c5 10.100.0.6
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.627 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:7f:c5 10.100.0.6'], port_security=['fa:16:3e:e9:7f:c5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a544701e-2e05-4802-ba07-c012963707f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '825c15e60ddd4efeb69accacdb4b129b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a38a8d4d-db17-4f3b-93ae-8cd0d57a26b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70afdb24-37c1-41f2-9284-84cfdd4b7137, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=44997b83-4510-4cb4-9923-c9f1eb78e769) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.630 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 44997b83-4510-4cb4-9923-c9f1eb78e769 in datapath a544701e-2e05-4802-ba07-c012963707f2 bound to our chassis
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.632 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a544701e-2e05-4802-ba07-c012963707f2
Jan 22 22:28:58 compute-0 systemd-udevd[220394]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.649 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b9249a23-69b8-4e8b-888b-b4224c019bed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.650 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa544701e-21 in ovnmeta-a544701e-2e05-4802-ba07-c012963707f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.652 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa544701e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.652 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[af72afcf-e3bc-438e-8896-6fc5ea02ee8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.653 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[460c1d7d-7c3a-4feb-96fa-82a91f3e3d55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 NetworkManager[54954]: <info>  [1769120938.6559] device (tap44997b83-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:28:58 compute-0 NetworkManager[54954]: <info>  [1769120938.6565] device (tap44997b83-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.667 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[89e6aa84-24e4-48de-ab15-dd02b9798d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 systemd-machined[154006]: New machine qemu-29-instance-00000048.
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.681 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:58 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-00000048.
Jan 22 22:28:58 compute-0 ovn_controller[94850]: 2026-01-22T22:28:58Z|00208|binding|INFO|Setting lport 44997b83-4510-4cb4-9923-c9f1eb78e769 ovn-installed in OVS
Jan 22 22:28:58 compute-0 ovn_controller[94850]: 2026-01-22T22:28:58Z|00209|binding|INFO|Setting lport 44997b83-4510-4cb4-9923-c9f1eb78e769 up in Southbound
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.687 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.694 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aa243e0d-4072-4841-b29b-9cd0d22a30eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.736 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[af905430-7e0e-4c36-a575-50342db1c3ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.744 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60b4914c-5e23-4dba-bbd8-179e47621373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 NetworkManager[54954]: <info>  [1769120938.7462] manager: (tapa544701e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.787 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fa64fbff-c91d-4116-8735-e8229637eeb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.793 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a82706bc-73de-4ae8-87dc-72592cf9fc2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 NetworkManager[54954]: <info>  [1769120938.8217] device (tapa544701e-20): carrier: link connected
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.831 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c03ec02c-2131-4331-a35b-58aab47832d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.837 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.858 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a8378b-901a-44cc-9bbd-08576a25e70f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa544701e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:37:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457642, 'reachable_time': 34940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220430, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.879 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19908134-eae2-43de-9dba-f1a86c30cd63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:3761'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457642, 'tstamp': 457642}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220431, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.910 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d849f180-023d-4662-8533-452fc519da26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa544701e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:37:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457642, 'reachable_time': 34940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220432, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.926 182729 DEBUG nova.compute.manager [req-f98f0b3e-dc65-4e56-9fc6-4b68f0a1dcc0 req-ed041112-84c7-4584-970e-97c598a8076f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.927 182729 DEBUG oslo_concurrency.lockutils [req-f98f0b3e-dc65-4e56-9fc6-4b68f0a1dcc0 req-ed041112-84c7-4584-970e-97c598a8076f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.927 182729 DEBUG oslo_concurrency.lockutils [req-f98f0b3e-dc65-4e56-9fc6-4b68f0a1dcc0 req-ed041112-84c7-4584-970e-97c598a8076f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.928 182729 DEBUG oslo_concurrency.lockutils [req-f98f0b3e-dc65-4e56-9fc6-4b68f0a1dcc0 req-ed041112-84c7-4584-970e-97c598a8076f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:28:58 compute-0 nova_compute[182725]: 2026-01-22 22:28:58.928 182729 DEBUG nova.compute.manager [req-f98f0b3e-dc65-4e56-9fc6-4b68f0a1dcc0 req-ed041112-84c7-4584-970e-97c598a8076f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Processing event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:28:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:58.967 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b981fbc7-88da-41fc-966a-39f6f5f57a1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.059 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef644275-352b-4bff-b5f8-ed90d69fdd19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.061 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa544701e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.061 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.062 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa544701e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:59 compute-0 kernel: tapa544701e-20: entered promiscuous mode
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.069 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa544701e-20, col_values=(('external_ids', {'iface-id': '57be3db3-de80-45e5-a479-c3b4b4920475'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:28:59 compute-0 NetworkManager[54954]: <info>  [1769120939.0698] manager: (tapa544701e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 22 22:28:59 compute-0 ovn_controller[94850]: 2026-01-22T22:28:59Z|00210|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.064 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.067 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.070 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.078 182729 DEBUG nova.network.neutron [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updated VIF entry in instance network info cache for port 44997b83-4510-4cb4-9923-c9f1eb78e769. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.079 182729 DEBUG nova.network.neutron [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.094 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.096 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a544701e-2e05-4802-ba07-c012963707f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a544701e-2e05-4802-ba07-c012963707f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.098 182729 DEBUG oslo_concurrency.lockutils [req-1aaf0f62-897f-4ebe-92bd-3ee294cba2a6 req-405b0f5c-4aca-44c5-82b7-e6eeeb868298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.099 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e2af7116-df4a-44db-84ad-0f158973bc1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.100 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-a544701e-2e05-4802-ba07-c012963707f2
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/a544701e-2e05-4802-ba07-c012963707f2.pid.haproxy
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID a544701e-2e05-4802-ba07-c012963707f2
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:28:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:28:59.101 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'env', 'PROCESS_TAG=haproxy-a544701e-2e05-4802-ba07-c012963707f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a544701e-2e05-4802-ba07-c012963707f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:28:59 compute-0 podman[220468]: 2026-01-22 22:28:59.564496372 +0000 UTC m=+0.073535299 container create c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.587 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.589 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120939.58904, fcd5dfc9-aa45-42d6-96d8-739f7eb5504a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.590 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] VM Started (Lifecycle Event)
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.595 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.603 182729 INFO nova.virt.libvirt.driver [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Instance spawned successfully.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.604 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:28:59 compute-0 podman[220468]: 2026-01-22 22:28:59.524599339 +0000 UTC m=+0.033638366 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:28:59 compute-0 systemd[1]: Started libpod-conmon-c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f.scope.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.626 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.633 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.638 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.639 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.639 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.640 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.640 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.641 182729 DEBUG nova.virt.libvirt.driver [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.657 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.658 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120939.5901337, fcd5dfc9-aa45-42d6-96d8-739f7eb5504a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.658 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] VM Paused (Lifecycle Event)
Jan 22 22:28:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:28:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bb91b3b1dc4a21125cd99b8b364a0db20849ba69ce7c1133e106595a9b074ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:28:59 compute-0 podman[220468]: 2026-01-22 22:28:59.690476352 +0000 UTC m=+0.199515279 container init c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:28:59 compute-0 podman[220468]: 2026-01-22 22:28:59.700429714 +0000 UTC m=+0.209468641 container start c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.704 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.711 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120939.5943754, fcd5dfc9-aa45-42d6-96d8-739f7eb5504a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.711 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] VM Resumed (Lifecycle Event)
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.730 182729 INFO nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Took 6.49 seconds to spawn the instance on the hypervisor.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.731 182729 DEBUG nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.734 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.741 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:28:59 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [NOTICE]   (220492) : New worker (220494) forked
Jan 22 22:28:59 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [NOTICE]   (220492) : Loading success.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.762 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.822 182729 INFO nova.compute.manager [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Took 7.06 seconds to build instance.
Jan 22 22:28:59 compute-0 nova_compute[182725]: 2026-01-22 22:28:59.853 182729 DEBUG oslo_concurrency.lockutils [None req-02a0d8d8-0c84-4cfd-a459-e1cd59e29b3d fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.051 182729 DEBUG nova.compute.manager [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.052 182729 DEBUG oslo_concurrency.lockutils [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.052 182729 DEBUG oslo_concurrency.lockutils [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.053 182729 DEBUG oslo_concurrency.lockutils [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.053 182729 DEBUG nova.compute.manager [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] No waiting events found dispatching network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.053 182729 WARNING nova.compute.manager [req-0b6721f3-cf49-49e6-ab5c-0605179bf884 req-dd852e97-37f5-46fd-b752-038ffbd1aa5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received unexpected event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 for instance with vm_state active and task_state None.
Jan 22 22:29:01 compute-0 NetworkManager[54954]: <info>  [1769120941.8530] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 22 22:29:01 compute-0 NetworkManager[54954]: <info>  [1769120941.8540] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.970 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:01 compute-0 ovn_controller[94850]: 2026-01-22T22:29:01Z|00211|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:29:01 compute-0 nova_compute[182725]: 2026-01-22 22:29:01.987 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.226 182729 DEBUG nova.compute.manager [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-changed-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.227 182729 DEBUG nova.compute.manager [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Refreshing instance network info cache due to event network-changed-44997b83-4510-4cb4-9923-c9f1eb78e769. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.227 182729 DEBUG oslo_concurrency.lockutils [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.227 182729 DEBUG oslo_concurrency.lockutils [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.227 182729 DEBUG nova.network.neutron [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Refreshing network info cache for port 44997b83-4510-4cb4-9923-c9f1eb78e769 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.611 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:02 compute-0 nova_compute[182725]: 2026-01-22 22:29:02.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:03 compute-0 ovn_controller[94850]: 2026-01-22T22:29:03Z|00212|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:29:03 compute-0 nova_compute[182725]: 2026-01-22 22:29:03.353 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:03 compute-0 nova_compute[182725]: 2026-01-22 22:29:03.827 182729 DEBUG nova.network.neutron [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updated VIF entry in instance network info cache for port 44997b83-4510-4cb4-9923-c9f1eb78e769. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:29:03 compute-0 nova_compute[182725]: 2026-01-22 22:29:03.828 182729 DEBUG nova.network.neutron [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:03 compute-0 nova_compute[182725]: 2026-01-22 22:29:03.842 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:03 compute-0 nova_compute[182725]: 2026-01-22 22:29:03.876 182729 DEBUG oslo_concurrency.lockutils [req-3828dd71-db2b-40d1-977e-344a98122c62 req-e913942a-7a5c-42e3-ac1a-09d87ebe7714 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:04 compute-0 nova_compute[182725]: 2026-01-22 22:29:04.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:04 compute-0 nova_compute[182725]: 2026-01-22 22:29:04.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:29:07 compute-0 nova_compute[182725]: 2026-01-22 22:29:07.615 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:08 compute-0 nova_compute[182725]: 2026-01-22 22:29:08.844 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:10 compute-0 nova_compute[182725]: 2026-01-22 22:29:10.940 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:11 compute-0 ovn_controller[94850]: 2026-01-22T22:29:11Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:7f:c5 10.100.0.6
Jan 22 22:29:11 compute-0 ovn_controller[94850]: 2026-01-22T22:29:11Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:7f:c5 10.100.0.6
Jan 22 22:29:12 compute-0 podman[220517]: 2026-01-22 22:29:12.163854853 +0000 UTC m=+0.087865592 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 22:29:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:12.436 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:12.438 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:12.438 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.619 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.968 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.969 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.969 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:12 compute-0 nova_compute[182725]: 2026-01-22 22:29:12.970 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.038 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.102 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.104 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.190 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.354 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.355 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5512MB free_disk=73.3478775024414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.356 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.356 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.522 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance fcd5dfc9-aa45-42d6-96d8-739f7eb5504a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.523 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.524 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.674 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.689 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.721 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.722 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:13 compute-0 nova_compute[182725]: 2026-01-22 22:29:13.847 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:14 compute-0 podman[220546]: 2026-01-22 22:29:14.173657756 +0000 UTC m=+0.087356830 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 22:29:14 compute-0 podman[220545]: 2026-01-22 22:29:14.185556678 +0000 UTC m=+0.112535979 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.701 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.701 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.701 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.869 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.870 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.870 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:29:16 compute-0 nova_compute[182725]: 2026-01-22 22:29:16.870 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:29:17 compute-0 nova_compute[182725]: 2026-01-22 22:29:17.104 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:17 compute-0 nova_compute[182725]: 2026-01-22 22:29:17.621 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.617 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.639 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.640 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.642 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.851 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:18 compute-0 nova_compute[182725]: 2026-01-22 22:29:18.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:19 compute-0 nova_compute[182725]: 2026-01-22 22:29:19.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:19 compute-0 nova_compute[182725]: 2026-01-22 22:29:19.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:29:19 compute-0 nova_compute[182725]: 2026-01-22 22:29:19.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:21 compute-0 nova_compute[182725]: 2026-01-22 22:29:21.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:22 compute-0 ovn_controller[94850]: 2026-01-22T22:29:22Z|00213|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:29:22 compute-0 nova_compute[182725]: 2026-01-22 22:29:22.625 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:22 compute-0 nova_compute[182725]: 2026-01-22 22:29:22.702 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:22 compute-0 nova_compute[182725]: 2026-01-22 22:29:22.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:23 compute-0 nova_compute[182725]: 2026-01-22 22:29:23.853 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:24 compute-0 podman[220592]: 2026-01-22 22:29:24.147217901 +0000 UTC m=+0.067401463 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:29:24 compute-0 podman[220593]: 2026-01-22 22:29:24.159167934 +0000 UTC m=+0.073957719 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:29:25 compute-0 nova_compute[182725]: 2026-01-22 22:29:25.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:26 compute-0 nova_compute[182725]: 2026-01-22 22:29:26.881 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:26 compute-0 nova_compute[182725]: 2026-01-22 22:29:26.882 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:26 compute-0 nova_compute[182725]: 2026-01-22 22:29:26.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:26 compute-0 nova_compute[182725]: 2026-01-22 22:29:26.916 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.050 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.050 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.059 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.060 182729 INFO nova.compute.claims [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.230 182729 DEBUG nova.compute.provider_tree [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.250 182729 DEBUG nova.scheduler.client.report [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.289 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.290 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.369 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.369 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.392 182729 INFO nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.421 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.543 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.545 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.546 182729 INFO nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Creating image(s)
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.547 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.548 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.549 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.573 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.611 182729 DEBUG nova.policy [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d68435ec92a4e0e900bbd275c277a15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.627 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.670 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.671 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.672 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.696 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.772 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:27 compute-0 nova_compute[182725]: 2026-01-22 22:29:27.774 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:28 compute-0 podman[220644]: 2026-01-22 22:29:28.161213343 +0000 UTC m=+0.081764228 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.182 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk 1073741824" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.184 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.185 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.263 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.265 182729 DEBUG nova.virt.disk.api [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Checking if we can resize image /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.265 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.328 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.330 182729 DEBUG nova.virt.disk.api [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Cannot resize image /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.331 182729 DEBUG nova.objects.instance [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lazy-loading 'migration_context' on Instance uuid 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.350 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.351 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Ensure instance console log exists: /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.352 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.352 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.353 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:28 compute-0 nova_compute[182725]: 2026-01-22 22:29:28.856 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:29 compute-0 nova_compute[182725]: 2026-01-22 22:29:29.162 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully created port: 9f0fc2de-e685-4869-b02e-2755f938de79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:29:29 compute-0 nova_compute[182725]: 2026-01-22 22:29:29.854 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully created port: 1785457a-7bc5-41e8-9339-52af1358fb85 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:29:31 compute-0 nova_compute[182725]: 2026-01-22 22:29:31.053 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully created port: bb779c0b-7bb9-4554-bb17-a824e418cd3e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:29:31 compute-0 nova_compute[182725]: 2026-01-22 22:29:31.588 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:31 compute-0 nova_compute[182725]: 2026-01-22 22:29:31.994 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully updated port: 9f0fc2de-e685-4869-b02e-2755f938de79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.126 182729 DEBUG nova.compute.manager [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-changed-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.126 182729 DEBUG nova.compute.manager [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing instance network info cache due to event network-changed-9f0fc2de-e685-4869-b02e-2755f938de79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.127 182729 DEBUG oslo_concurrency.lockutils [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.127 182729 DEBUG oslo_concurrency.lockutils [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.127 182729 DEBUG nova.network.neutron [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing network info cache for port 9f0fc2de-e685-4869-b02e-2755f938de79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.340 182729 DEBUG nova.network.neutron [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.631 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:32 compute-0 nova_compute[182725]: 2026-01-22 22:29:32.811 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully updated port: 1785457a-7bc5-41e8-9339-52af1358fb85 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.228 182729 DEBUG nova.network.neutron [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.252 182729 DEBUG oslo_concurrency.lockutils [req-31c31239-7cfc-470c-b96a-c002532d6383 req-4e51a0f4-d8cf-4fd0-9792-f091f95f5477 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.858 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.892 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Successfully updated port: bb779c0b-7bb9-4554-bb17-a824e418cd3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.906 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.907 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquired lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:33 compute-0 nova_compute[182725]: 2026-01-22 22:29:33.907 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:29:34 compute-0 nova_compute[182725]: 2026-01-22 22:29:34.119 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:29:34 compute-0 nova_compute[182725]: 2026-01-22 22:29:34.220 182729 DEBUG nova.compute.manager [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-changed-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:34 compute-0 nova_compute[182725]: 2026-01-22 22:29:34.221 182729 DEBUG nova.compute.manager [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing instance network info cache due to event network-changed-1785457a-7bc5-41e8-9339-52af1358fb85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:29:34 compute-0 nova_compute[182725]: 2026-01-22 22:29:34.221 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:34 compute-0 nova_compute[182725]: 2026-01-22 22:29:34.943 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.460 182729 DEBUG nova.network.neutron [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.481 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Releasing lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.482 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance network_info: |[{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.483 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.484 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing network info cache for port 1785457a-7bc5-41e8-9339-52af1358fb85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.491 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Start _get_guest_xml network_info=[{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.499 182729 WARNING nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.505 182729 DEBUG nova.virt.libvirt.host [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.506 182729 DEBUG nova.virt.libvirt.host [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.513 182729 DEBUG nova.virt.libvirt.host [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.514 182729 DEBUG nova.virt.libvirt.host [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.515 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.515 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.516 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.516 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.516 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.516 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.517 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.517 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.517 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.517 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.518 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.518 182729 DEBUG nova.virt.hardware [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.522 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.522 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.523 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.524 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.524 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.525 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.525 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-p
roject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.525 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.526 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.526 182729 DEBUG nova.objects.instance [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lazy-loading 'pci_devices' on Instance uuid 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.544 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <uuid>87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6</uuid>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <name>instance-0000004c</name>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersTestMultiNic-server-2079905841</nova:name>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:29:37</nova:creationTime>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:user uuid="3d68435ec92a4e0e900bbd275c277a15">tempest-ServersTestMultiNic-708515425-project-member</nova:user>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:project uuid="6721d3fe421f42c6a38a2d2e9378217a">tempest-ServersTestMultiNic-708515425</nova:project>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:port uuid="9f0fc2de-e685-4869-b02e-2755f938de79">
Jan 22 22:29:37 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:port uuid="1785457a-7bc5-41e8-9339-52af1358fb85">
Jan 22 22:29:37 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.1.210" ipVersion="4"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         <nova:port uuid="bb779c0b-7bb9-4554-bb17-a824e418cd3e">
Jan 22 22:29:37 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.44" ipVersion="4"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <system>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="serial">87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="uuid">87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </system>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <os>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </os>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <features>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </features>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.config"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:ac:ae:3a"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <target dev="tap9f0fc2de-e6"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:45:51:fb"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <target dev="tap1785457a-7b"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:7c:3e:28"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <target dev="tapbb779c0b-7b"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/console.log" append="off"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <video>
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </video>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:29:37 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:29:37 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:29:37 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:29:37 compute-0 nova_compute[182725]: </domain>
Jan 22 22:29:37 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.546 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Preparing to wait for external event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.547 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.548 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.549 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.549 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Preparing to wait for external event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.550 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.550 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.551 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.552 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Preparing to wait for external event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.552 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.553 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.553 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.554 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.555 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.556 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.557 182729 DEBUG os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.558 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.559 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.560 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.564 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.564 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f0fc2de-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.565 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f0fc2de-e6, col_values=(('external_ids', {'iface-id': '9f0fc2de-e685-4869-b02e-2755f938de79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:ae:3a', 'vm-uuid': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.567 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 NetworkManager[54954]: <info>  [1769120977.5694] manager: (tap9f0fc2de-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.572 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.580 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.582 182729 INFO os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6')
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.583 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.584 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.585 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.586 182729 DEBUG os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.586 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.587 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.587 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.590 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.591 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1785457a-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.592 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1785457a-7b, col_values=(('external_ids', {'iface-id': '1785457a-7bc5-41e8-9339-52af1358fb85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:51:fb', 'vm-uuid': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.594 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 NetworkManager[54954]: <info>  [1769120977.5950] manager: (tap1785457a-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.598 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.604 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.605 182729 INFO os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b')
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.606 182729 DEBUG nova.virt.libvirt.vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-7
08515425-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:27Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.606 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.607 182729 DEBUG nova.network.os_vif_util [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.607 182729 DEBUG os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.608 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.608 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.609 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.611 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.611 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb779c0b-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.612 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb779c0b-7b, col_values=(('external_ids', {'iface-id': 'bb779c0b-7bb9-4554-bb17-a824e418cd3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:3e:28', 'vm-uuid': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.614 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 NetworkManager[54954]: <info>  [1769120977.6150] manager: (tapbb779c0b-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.617 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.627 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.630 182729 INFO os_vif [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b')
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.687 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.688 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.688 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] No VIF found with MAC fa:16:3e:ac:ae:3a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.688 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] No VIF found with MAC fa:16:3e:45:51:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.688 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] No VIF found with MAC fa:16:3e:7c:3e:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:29:37 compute-0 nova_compute[182725]: 2026-01-22 22:29:37.689 182729 INFO nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Using config drive
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.162 182729 INFO nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Creating config drive at /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.config
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.172 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmb3w93r2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.319 182729 DEBUG oslo_concurrency.processutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmb3w93r2" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4091] manager: (tap9f0fc2de-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 22 22:29:38 compute-0 kernel: tap9f0fc2de-e6: entered promiscuous mode
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00214|binding|INFO|Claiming lport 9f0fc2de-e685-4869-b02e-2755f938de79 for this chassis.
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00215|binding|INFO|9f0fc2de-e685-4869-b02e-2755f938de79: Claiming fa:16:3e:ac:ae:3a 10.100.0.21
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.425 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4315] manager: (tap1785457a-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.435 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:ae:3a 10.100.0.21'], port_security=['fa:16:3e:ac:ae:3a 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ff6643f-512c-4ca7-acea-e4fbb1fa2234, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9f0fc2de-e685-4869-b02e-2755f938de79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.436 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9f0fc2de-e685-4869-b02e-2755f938de79 in datapath 3c6383b5-4f25-45b9-ae16-88694b1c61a8 bound to our chassis
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.437 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c6383b5-4f25-45b9-ae16-88694b1c61a8
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.457 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3897db6-f87f-4cdb-a1dd-923b7fc904ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.458 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c6383b5-41 in ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:29:38 compute-0 systemd-udevd[220704]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:29:38 compute-0 systemd-udevd[220703]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4633] manager: (tapbb779c0b-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.462 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c6383b5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.462 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9e524f3d-c777-4ffc-93ee-4ac3bd7a8bb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 systemd-udevd[220705]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.464 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ade15394-2777-4d20-a39d-0f3d32cb3c8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 kernel: tapbb779c0b-7b: entered promiscuous mode
Jan 22 22:29:38 compute-0 kernel: tap1785457a-7b: entered promiscuous mode
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4829] device (tap9f0fc2de-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4854] device (tap1785457a-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00216|binding|INFO|Claiming lport bb779c0b-7bb9-4554-bb17-a824e418cd3e for this chassis.
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00217|binding|INFO|bb779c0b-7bb9-4554-bb17-a824e418cd3e: Claiming fa:16:3e:7c:3e:28 10.100.0.44
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00218|binding|INFO|Claiming lport 1785457a-7bc5-41e8-9339-52af1358fb85 for this chassis.
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00219|binding|INFO|1785457a-7bc5-41e8-9339-52af1358fb85: Claiming fa:16:3e:45:51:fb 10.100.1.210
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4908] device (tap9f0fc2de-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.490 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4928] device (tap1785457a-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.492 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4937] device (tapbb779c0b-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.4949] device (tapbb779c0b-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00220|binding|INFO|Setting lport 9f0fc2de-e685-4869-b02e-2755f938de79 ovn-installed in OVS
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.496 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00221|binding|INFO|Setting lport 9f0fc2de-e685-4869-b02e-2755f938de79 up in Southbound
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.502 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:3e:28 10.100.0.44'], port_security=['fa:16:3e:7c:3e:28 10.100.0.44'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.44/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ff6643f-512c-4ca7-acea-e4fbb1fa2234, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bb779c0b-7bb9-4554-bb17-a824e418cd3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.506 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:51:fb 10.100.1.210'], port_security=['fa:16:3e:45:51:fb 10.100.1.210'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.210/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9638a29e-60d5-4398-ba9b-5875dd746da9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2edc5ce5-ed60-4c34-ac19-45d7b3d2ab93, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=1785457a-7bc5-41e8-9339-52af1358fb85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.501 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2e5469-bc36-46f4-8ee3-47d57e6cd7fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 systemd-machined[154006]: New machine qemu-30-instance-0000004c.
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00222|binding|INFO|Setting lport 1785457a-7bc5-41e8-9339-52af1358fb85 ovn-installed in OVS
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00223|binding|INFO|Setting lport 1785457a-7bc5-41e8-9339-52af1358fb85 up in Southbound
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00224|binding|INFO|Setting lport bb779c0b-7bb9-4554-bb17-a824e418cd3e ovn-installed in OVS
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00225|binding|INFO|Setting lport bb779c0b-7bb9-4554-bb17-a824e418cd3e up in Southbound
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.537 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[63cda749-3c94-4d34-b265-d361517acb6b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.537 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-0000004c.
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.573 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[35dedb2f-6d24-4b1c-aba2-d9ba5022bba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.5853] manager: (tap3c6383b5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.586 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[baf0e64a-c9df-4842-9b7b-2f7be5a8eb20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.625 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8302cf-5524-4361-bd4e-f4ffb2f8fce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.629 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[849b2d3b-4665-4a26-99d0-d6aa00e12d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.6562] device (tap3c6383b5-40): carrier: link connected
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.660 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3e61dce2-b4cb-4cc3-a399-32a0669e5ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.682 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4b40d818-61a8-4cf4-99eb-71d221ee7538]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6383b5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:26:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461625, 'reachable_time': 39572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220741, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.699 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d764aa27-3e63-4a72-b00f-8c9c69f5f462]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:26c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461625, 'tstamp': 461625}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220742, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.717 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[86cc0140-4de1-4ae6-8c60-53013d2f8268]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6383b5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:26:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461625, 'reachable_time': 39572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220743, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.753 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b2670e33-8324-4c6a-994e-27065ded7f52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.834 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[223f2ae3-615a-47d3-9f3e-d54f182e72ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.836 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6383b5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.837 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.837 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c6383b5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.839 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 NetworkManager[54954]: <info>  [1769120978.8410] manager: (tap3c6383b5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 22 22:29:38 compute-0 kernel: tap3c6383b5-40: entered promiscuous mode
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.843 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.846 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c6383b5-40, col_values=(('external_ids', {'iface-id': '20eda616-4760-434c-b9b0-ba649bdfc9d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.847 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 ovn_controller[94850]: 2026-01-22T22:29:38Z|00226|binding|INFO|Releasing lport 20eda616-4760-434c-b9b0-ba649bdfc9d7 from this chassis (sb_readonly=0)
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.866 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 nova_compute[182725]: 2026-01-22 22:29:38.870 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.871 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c6383b5-4f25-45b9-ae16-88694b1c61a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c6383b5-4f25-45b9-ae16-88694b1c61a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.872 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1e82c250-490e-4073-9b66-954af56932db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.873 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-3c6383b5-4f25-45b9-ae16-88694b1c61a8
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/3c6383b5-4f25-45b9-ae16-88694b1c61a8.pid.haproxy
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 3c6383b5-4f25-45b9-ae16-88694b1c61a8
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:29:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:38.874 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'env', 'PROCESS_TAG=haproxy-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c6383b5-4f25-45b9-ae16-88694b1c61a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.117 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120979.1166837, 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.118 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] VM Started (Lifecycle Event)
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.140 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.145 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120979.1168463, 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.146 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] VM Paused (Lifecycle Event)
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.164 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.168 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.186 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:29:39 compute-0 podman[220784]: 2026-01-22 22:29:39.302350871 +0000 UTC m=+0.032045125 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:29:39 compute-0 podman[220784]: 2026-01-22 22:29:39.398884783 +0000 UTC m=+0.128578947 container create 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:29:39 compute-0 systemd[1]: Started libpod-conmon-5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8.scope.
Jan 22 22:29:39 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:29:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a840841dc747cd8f385912d50251127686d26eaccdd4eca3b3670eed95f11a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:29:39 compute-0 podman[220784]: 2026-01-22 22:29:39.589518373 +0000 UTC m=+0.319212627 container init 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:29:39 compute-0 podman[220784]: 2026-01-22 22:29:39.601336373 +0000 UTC m=+0.331030537 container start 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:29:39 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [NOTICE]   (220803) : New worker (220805) forked
Jan 22 22:29:39 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [NOTICE]   (220803) : Loading success.
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.698 182729 DEBUG nova.compute.manager [req-98f8cdf6-d2a7-495f-92e6-84c8045a6228 req-3ebccf7b-c1ca-42b5-9ee4-fe35ecaf3bfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.699 182729 DEBUG oslo_concurrency.lockutils [req-98f8cdf6-d2a7-495f-92e6-84c8045a6228 req-3ebccf7b-c1ca-42b5-9ee4-fe35ecaf3bfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.700 182729 DEBUG oslo_concurrency.lockutils [req-98f8cdf6-d2a7-495f-92e6-84c8045a6228 req-3ebccf7b-c1ca-42b5-9ee4-fe35ecaf3bfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.700 182729 DEBUG oslo_concurrency.lockutils [req-98f8cdf6-d2a7-495f-92e6-84c8045a6228 req-3ebccf7b-c1ca-42b5-9ee4-fe35ecaf3bfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.700 182729 DEBUG nova.compute.manager [req-98f8cdf6-d2a7-495f-92e6-84c8045a6228 req-3ebccf7b-c1ca-42b5-9ee4-fe35ecaf3bfe 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Processing event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.701 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bb779c0b-7bb9-4554-bb17-a824e418cd3e in datapath 3c6383b5-4f25-45b9-ae16-88694b1c61a8 unbound from our chassis
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.704 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c6383b5-4f25-45b9-ae16-88694b1c61a8
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.724 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8562ed82-0743-47af-8c71-eb0bfc1aacd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.766 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0f989e28-a490-4319-8c63-0d1444e77bb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.772 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a283b8f3-6421-44c1-98b3-f54257449883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.822 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[141be9af-6cfa-497c-9c90-8f7d11124180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.844 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e61a12ed-c94e-453a-a2b2-e20c89fa185d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6383b5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:26:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461625, 'reachable_time': 39572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220819, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.869 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[52d13a77-435d-40be-9808-ebdeda170c4e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3c6383b5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461638, 'tstamp': 461638}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220820, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap3c6383b5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461642, 'tstamp': 461642}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220820, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.873 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6383b5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:39 compute-0 nova_compute[182725]: 2026-01-22 22:29:39.876 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.878 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c6383b5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.879 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.880 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c6383b5-40, col_values=(('external_ids', {'iface-id': '20eda616-4760-434c-b9b0-ba649bdfc9d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.880 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.882 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 1785457a-7bc5-41e8-9339-52af1358fb85 in datapath 9638a29e-60d5-4398-ba9b-5875dd746da9 unbound from our chassis
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.885 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9638a29e-60d5-4398-ba9b-5875dd746da9
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.900 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[555131dd-c98f-4491-af0c-ccd99209723f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.901 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9638a29e-61 in ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.904 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9638a29e-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.904 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[91335591-0619-49ba-951b-f8a2307ab091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.905 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[72ae78bd-0867-49e4-9787-9cfde6dbb3ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.921 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7c7d44-97ab-4df3-996d-44160bc8fabd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.941 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f91cd8fb-8df3-4040-a6f4-f866b53a6fa1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.981 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf6481c-0003-41b9-bd30-6ccfd9039990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:39.989 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[34925aee-27e4-467d-b117-1349aa790d9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:39 compute-0 NetworkManager[54954]: <info>  [1769120979.9903] manager: (tap9638a29e-60): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 22 22:29:39 compute-0 systemd-udevd[220736]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.030 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d523a9a9-6101-4125-9e76-0d2117ad7061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.033 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f0a0c9-3883-4cc6-bad9-1218cbc3a832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 NetworkManager[54954]: <info>  [1769120980.0643] device (tap9638a29e-60): carrier: link connected
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.073 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1487ca44-6ad0-4adc-b898-e06d4ce9e8b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.097 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb84932-8cc4-4093-a1d3-f0baf21a035d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9638a29e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:52:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461766, 'reachable_time': 44526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220831, 'error': None, 'target': 'ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.117 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[153e285f-5db6-406c-8ac4-aab2a2b98048]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:522b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461766, 'tstamp': 461766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220832, 'error': None, 'target': 'ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.140 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1185ffbd-44d1-4043-9727-5f72d9c5ba76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9638a29e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:52:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461766, 'reachable_time': 44526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220833, 'error': None, 'target': 'ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.169 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updated VIF entry in instance network info cache for port 1785457a-7bc5-41e8-9339-52af1358fb85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.170 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.181 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3376fe71-1d20-4e7f-bfca-39328077a117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.186 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.187 182729 DEBUG nova.compute.manager [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-changed-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.187 182729 DEBUG nova.compute.manager [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing instance network info cache due to event network-changed-bb779c0b-7bb9-4554-bb17-a824e418cd3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.187 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.187 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.188 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Refreshing network info cache for port bb779c0b-7bb9-4554-bb17-a824e418cd3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.257 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[084599b5-d813-4519-b424-902a23538d73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.259 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9638a29e-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.259 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.259 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9638a29e-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:40 compute-0 kernel: tap9638a29e-60: entered promiscuous mode
Jan 22 22:29:40 compute-0 NetworkManager[54954]: <info>  [1769120980.2629] manager: (tap9638a29e-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.266 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9638a29e-60, col_values=(('external_ids', {'iface-id': '49a939f5-0556-45c1-a004-a15ffeb9c4f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.268 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:40 compute-0 ovn_controller[94850]: 2026-01-22T22:29:40Z|00227|binding|INFO|Releasing lport 49a939f5-0556-45c1-a004-a15ffeb9c4f0 from this chassis (sb_readonly=0)
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.270 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9638a29e-60d5-4398-ba9b-5875dd746da9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9638a29e-60d5-4398-ba9b-5875dd746da9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.271 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[33f9f737-6468-4101-a1ef-8656b016790e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.272 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-9638a29e-60d5-4398-ba9b-5875dd746da9
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/9638a29e-60d5-4398-ba9b-5875dd746da9.pid.haproxy
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 9638a29e-60d5-4398-ba9b-5875dd746da9
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:29:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:40.273 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9', 'env', 'PROCESS_TAG=haproxy-9638a29e-60d5-4398-ba9b-5875dd746da9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9638a29e-60d5-4398-ba9b-5875dd746da9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.280 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.374 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.375 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.375 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.376 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.376 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Processing event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.376 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.377 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.377 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.378 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.378 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No event matching network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e in dict_keys([('network-vif-plugged', '1785457a-7bc5-41e8-9339-52af1358fb85')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.379 182729 WARNING nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e for instance with vm_state building and task_state spawning.
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.380 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.380 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.381 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.381 182729 DEBUG oslo_concurrency.lockutils [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.382 182729 DEBUG nova.compute.manager [req-ebf449b7-c29e-4fab-9cfc-ac87304e00c2 req-6afc1689-5304-43dd-aec9-22bc8667b0cf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Processing event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.383 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.388 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769120980.388581, 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.389 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] VM Resumed (Lifecycle Event)
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.398 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.404 182729 INFO nova.virt.libvirt.driver [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance spawned successfully.
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.405 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.420 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.431 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.445 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.445 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.446 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.447 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.448 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.449 182729 DEBUG nova.virt.libvirt.driver [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.460 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.545 182729 INFO nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Took 13.00 seconds to spawn the instance on the hypervisor.
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.547 182729 DEBUG nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.627 182729 INFO nova.compute.manager [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Took 13.63 seconds to build instance.
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.645 182729 DEBUG oslo_concurrency.lockutils [None req-d4b62dae-85de-4de1-b550-44550b634213 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:40 compute-0 podman[220864]: 2026-01-22 22:29:40.648761955 +0000 UTC m=+0.060438456 container create 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:29:40 compute-0 systemd[1]: Started libpod-conmon-5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c.scope.
Jan 22 22:29:40 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:29:40 compute-0 podman[220864]: 2026-01-22 22:29:40.614906835 +0000 UTC m=+0.026583376 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:29:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7e496fbc69eac12ecc12952adec6a37f91062b55876a3836d609ba0b4e7cd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:29:40 compute-0 podman[220864]: 2026-01-22 22:29:40.870716682 +0000 UTC m=+0.282393213 container init 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:29:40 compute-0 podman[220864]: 2026-01-22 22:29:40.883126627 +0000 UTC m=+0.294803128 container start 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:29:40 compute-0 nova_compute[182725]: 2026-01-22 22:29:40.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:29:40 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [NOTICE]   (220883) : New worker (220885) forked
Jan 22 22:29:40 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [NOTICE]   (220883) : Loading success.
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.791 182729 DEBUG nova.compute.manager [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.793 182729 DEBUG oslo_concurrency.lockutils [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.794 182729 DEBUG oslo_concurrency.lockutils [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.794 182729 DEBUG oslo_concurrency.lockutils [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.795 182729 DEBUG nova.compute.manager [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:41 compute-0 nova_compute[182725]: 2026-01-22 22:29:41.796 182729 WARNING nova.compute.manager [req-e98dc5e7-9443-4bf8-a5b6-e3015fd956c1 req-cf43ef5d-fa55-4a40-871c-b6de80dce00f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 for instance with vm_state active and task_state None.
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.382 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updated VIF entry in instance network info cache for port bb779c0b-7bb9-4554-bb17-a824e418cd3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.384 182729 DEBUG nova.network.neutron [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.428 182729 DEBUG oslo_concurrency.lockutils [req-c42d69e6-1484-4bf2-acd8-ffc190d81c12 req-bce0a8f9-6fdc-4632-8d6a-ebc7e00d9cb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.531 182729 DEBUG nova.compute.manager [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.532 182729 DEBUG oslo_concurrency.lockutils [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.532 182729 DEBUG oslo_concurrency.lockutils [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.533 182729 DEBUG oslo_concurrency.lockutils [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.534 182729 DEBUG nova.compute.manager [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.534 182729 WARNING nova.compute.manager [req-79842f8e-ec01-4e73-b7b6-12356f5f04af req-66352e49-a854-4600-96c3-948b6ac9de99 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 for instance with vm_state active and task_state None.
Jan 22 22:29:42 compute-0 nova_compute[182725]: 2026-01-22 22:29:42.615 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 podman[220894]: 2026-01-22 22:29:43.169839341 +0000 UTC m=+0.088687292 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.813 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.815 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.815 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.816 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.817 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.833 182729 INFO nova.compute.manager [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Terminating instance
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.848 182729 DEBUG nova.compute.manager [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 kernel: tap9f0fc2de-e6 (unregistering): left promiscuous mode
Jan 22 22:29:43 compute-0 NetworkManager[54954]: <info>  [1769120983.8788] device (tap9f0fc2de-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00228|binding|INFO|Releasing lport 9f0fc2de-e685-4869-b02e-2755f938de79 from this chassis (sb_readonly=0)
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00229|binding|INFO|Setting lport 9f0fc2de-e685-4869-b02e-2755f938de79 down in Southbound
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00230|binding|INFO|Removing iface tap9f0fc2de-e6 ovn-installed in OVS
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.894 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.907 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 kernel: tap1785457a-7b (unregistering): left promiscuous mode
Jan 22 22:29:43 compute-0 NetworkManager[54954]: <info>  [1769120983.9143] device (tap1785457a-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:29:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:43.930 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:ae:3a 10.100.0.21'], port_security=['fa:16:3e:ac:ae:3a 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ff6643f-512c-4ca7-acea-e4fbb1fa2234, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9f0fc2de-e685-4869-b02e-2755f938de79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:43.932 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9f0fc2de-e685-4869-b02e-2755f938de79 in datapath 3c6383b5-4f25-45b9-ae16-88694b1c61a8 unbound from our chassis
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00231|binding|INFO|Releasing lport 1785457a-7bc5-41e8-9339-52af1358fb85 from this chassis (sb_readonly=0)
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00232|binding|INFO|Setting lport 1785457a-7bc5-41e8-9339-52af1358fb85 down in Southbound
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00233|binding|INFO|Removing iface tap1785457a-7b ovn-installed in OVS
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.954 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:43.953 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c6383b5-4f25-45b9-ae16-88694b1c61a8
Jan 22 22:29:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:43.964 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:51:fb 10.100.1.210'], port_security=['fa:16:3e:45:51:fb 10.100.1.210'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.210/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9638a29e-60d5-4398-ba9b-5875dd746da9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2edc5ce5-ed60-4c34-ac19-45d7b3d2ab93, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=1785457a-7bc5-41e8-9339-52af1358fb85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:43 compute-0 kernel: tapbb779c0b-7b (unregistering): left promiscuous mode
Jan 22 22:29:43 compute-0 NetworkManager[54954]: <info>  [1769120983.9779] device (tapbb779c0b-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:29:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:43.986 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7fffeda6-d1d7-4b33-b5a8-f2855624a31e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00234|binding|INFO|Releasing lport bb779c0b-7bb9-4554-bb17-a824e418cd3e from this chassis (sb_readonly=0)
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00235|binding|INFO|Setting lport bb779c0b-7bb9-4554-bb17-a824e418cd3e down in Southbound
Jan 22 22:29:43 compute-0 ovn_controller[94850]: 2026-01-22T22:29:43Z|00236|binding|INFO|Removing iface tapbb779c0b-7b ovn-installed in OVS
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.992 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:43 compute-0 nova_compute[182725]: 2026-01-22 22:29:43.994 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.006 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:3e:28 10.100.0.44'], port_security=['fa:16:3e:7c:3e:28 10.100.0.44'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.44/24', 'neutron:device_id': '87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6721d3fe421f42c6a38a2d2e9378217a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3e1c4ea-635d-4aec-bf70-af57c2a58226', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ff6643f-512c-4ca7-acea-e4fbb1fa2234, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bb779c0b-7bb9-4554-bb17-a824e418cd3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.024 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.029 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[01701282-27b1-4665-b054-b411ed9f2c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.033 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[36952f8f-4312-4c58-967f-51f9df12fd39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 22 22:29:44 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004c.scope: Consumed 4.080s CPU time.
Jan 22 22:29:44 compute-0 systemd-machined[154006]: Machine qemu-30-instance-0000004c terminated.
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.073 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6f95e9c4-420c-48ab-aca8-7571622e8f50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 NetworkManager[54954]: <info>  [1769120984.0908] manager: (tapbb779c0b-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.098 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[59c6ad0c-fdc2-414b-9d07-d98a3d70ff3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6383b5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:26:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461625, 'reachable_time': 39572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220949, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.121 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc992d6-5c07-4916-8502-3f34eecc4d9c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3c6383b5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461638, 'tstamp': 461638}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220971, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap3c6383b5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461642, 'tstamp': 461642}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220971, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.123 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6383b5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.124 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.136 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.140 182729 INFO nova.virt.libvirt.driver [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Instance destroyed successfully.
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.139 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c6383b5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.140 182729 DEBUG nova.objects.instance [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lazy-loading 'resources' on Instance uuid 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.140 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.141 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c6383b5-40, col_values=(('external_ids', {'iface-id': '20eda616-4760-434c-b9b0-ba649bdfc9d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.141 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.144 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 1785457a-7bc5-41e8-9339-52af1358fb85 in datapath 9638a29e-60d5-4398-ba9b-5875dd746da9 unbound from our chassis
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.147 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9638a29e-60d5-4398-ba9b-5875dd746da9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.148 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f010ef05-b79c-4a76-9c8d-56db61c7a8f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.149 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9 namespace which is not needed anymore
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.172 182729 DEBUG nova.virt.libvirt.vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:29:40Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.173 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.174 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.174 182729 DEBUG os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.176 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.177 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f0fc2de-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.179 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.181 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.188 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.191 182729 INFO os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:ae:3a,bridge_name='br-int',has_traffic_filtering=True,id=9f0fc2de-e685-4869-b02e-2755f938de79,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f0fc2de-e6')
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.192 182729 DEBUG nova.virt.libvirt.vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:29:40Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.193 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "1785457a-7bc5-41e8-9339-52af1358fb85", "address": "fa:16:3e:45:51:fb", "network": {"id": "9638a29e-60d5-4398-ba9b-5875dd746da9", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2077878946", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1785457a-7b", "ovs_interfaceid": "1785457a-7bc5-41e8-9339-52af1358fb85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.193 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.194 182729 DEBUG os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.195 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.195 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1785457a-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.197 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.201 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.203 182729 INFO os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:51:fb,bridge_name='br-int',has_traffic_filtering=True,id=1785457a-7bc5-41e8-9339-52af1358fb85,network=Network(9638a29e-60d5-4398-ba9b-5875dd746da9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1785457a-7b')
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.204 182729 DEBUG nova.virt.libvirt.vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:29:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2079905841',display_name='tempest-ServersTestMultiNic-server-2079905841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2079905841',id=76,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6721d3fe421f42c6a38a2d2e9378217a',ramdisk_id='',reservation_id='r-pglz5d8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-708515425',owner_user_name='tempest-ServersTestMultiNic-708515425-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:29:40Z,user_data=None,user_id='3d68435ec92a4e0e900bbd275c277a15',uuid=87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.204 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converting VIF {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.204 182729 DEBUG nova.network.os_vif_util [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.205 182729 DEBUG os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.206 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb779c0b-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.207 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.210 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.214 182729 INFO os_vif [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:3e:28,bridge_name='br-int',has_traffic_filtering=True,id=bb779c0b-7bb9-4554-bb17-a824e418cd3e,network=Network(3c6383b5-4f25-45b9-ae16-88694b1c61a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb779c0b-7b')
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.215 182729 INFO nova.virt.libvirt.driver [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Deleting instance files /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6_del
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.216 182729 INFO nova.virt.libvirt.driver [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Deletion of /var/lib/nova/instances/87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6_del complete
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.298 182729 INFO nova.compute.manager [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.299 182729 DEBUG oslo.service.loopingcall [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.299 182729 DEBUG nova.compute.manager [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.300 182729 DEBUG nova.network.neutron [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [NOTICE]   (220883) : haproxy version is 2.8.14-c23fe91
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [NOTICE]   (220883) : path to executable is /usr/sbin/haproxy
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [WARNING]  (220883) : Exiting Master process...
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [WARNING]  (220883) : Exiting Master process...
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [ALERT]    (220883) : Current worker (220885) exited with code 143 (Terminated)
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9[220879]: [WARNING]  (220883) : All workers exited. Exiting... (0)
Jan 22 22:29:44 compute-0 systemd[1]: libpod-5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c.scope: Deactivated successfully.
Jan 22 22:29:44 compute-0 podman[221003]: 2026-01-22 22:29:44.318554155 +0000 UTC m=+0.069337552 container died 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:29:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c-userdata-shm.mount: Deactivated successfully.
Jan 22 22:29:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-5a7e496fbc69eac12ecc12952adec6a37f91062b55876a3836d609ba0b4e7cd8-merged.mount: Deactivated successfully.
Jan 22 22:29:44 compute-0 podman[221003]: 2026-01-22 22:29:44.377252566 +0000 UTC m=+0.128035953 container cleanup 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:29:44 compute-0 systemd[1]: libpod-conmon-5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c.scope: Deactivated successfully.
Jan 22 22:29:44 compute-0 podman[221028]: 2026-01-22 22:29:44.440270116 +0000 UTC m=+0.079789457 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible)
Jan 22 22:29:44 compute-0 podman[221051]: 2026-01-22 22:29:44.471305165 +0000 UTC m=+0.060269302 container remove 5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:29:44 compute-0 podman[221019]: 2026-01-22 22:29:44.479352559 +0000 UTC m=+0.114890179 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.478 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19c45e1f-4a22-41e2-9fc6-c4f98a01ac75]: (4, ('Thu Jan 22 10:29:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9 (5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c)\n5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c\nThu Jan 22 10:29:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9 (5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c)\n5375c85135bddaf3e14d0701f496f7e39fec2a401665ef060d45de4794e3a67c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.483 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7b70c34d-a4c7-49cb-bea7-2cf610296fa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.484 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9638a29e-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.485 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 kernel: tap9638a29e-60: left promiscuous mode
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.497 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.500 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[118997d1-484c-444f-b632-a55f65f230e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.516 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9232d9-b289-48fa-bdb4-976daa30f6db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.517 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f522a05c-ba29-4fb7-b7be-e699e339c5fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.536 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[466c0475-9faa-4c2e-8f7b-478dd6a704bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461757, 'reachable_time': 39619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221093, 'error': None, 'target': 'ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.540 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9638a29e-60d5-4398-ba9b-5875dd746da9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:29:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d9638a29e\x2d60d5\x2d4398\x2dba9b\x2d5875dd746da9.mount: Deactivated successfully.
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.541 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[faaa2941-b544-4c8e-9115-0f2f1b55bee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.542 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bb779c0b-7bb9-4554-bb17-a824e418cd3e in datapath 3c6383b5-4f25-45b9-ae16-88694b1c61a8 unbound from our chassis
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.544 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c6383b5-4f25-45b9-ae16-88694b1c61a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.545 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f94b7be0-778d-4841-8eb4-d8d81668b22b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.546 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8 namespace which is not needed anymore
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [NOTICE]   (220803) : haproxy version is 2.8.14-c23fe91
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [NOTICE]   (220803) : path to executable is /usr/sbin/haproxy
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [WARNING]  (220803) : Exiting Master process...
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [WARNING]  (220803) : Exiting Master process...
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [ALERT]    (220803) : Current worker (220805) exited with code 143 (Terminated)
Jan 22 22:29:44 compute-0 neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8[220799]: [WARNING]  (220803) : All workers exited. Exiting... (0)
Jan 22 22:29:44 compute-0 systemd[1]: libpod-5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8.scope: Deactivated successfully.
Jan 22 22:29:44 compute-0 conmon[220799]: conmon 5a5b897fc9c81ffb33c0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8.scope/container/memory.events
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.684 182729 DEBUG nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.684 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.684 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.684 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.685 182729 DEBUG nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-unplugged-1785457a-7bc5-41e8-9339-52af1358fb85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.685 182729 DEBUG nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-1785457a-7bc5-41e8-9339-52af1358fb85 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.685 182729 DEBUG nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.685 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.685 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.686 182729 DEBUG oslo_concurrency.lockutils [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.686 182729 DEBUG nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.686 182729 WARNING nova.compute.manager [req-30ba8d89-b43a-4d1c-a376-2304f4c4cd2a req-b62d18b3-b9fe-494f-8f5f-c05fbf009114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-1785457a-7bc5-41e8-9339-52af1358fb85 for instance with vm_state active and task_state deleting.
Jan 22 22:29:44 compute-0 podman[221111]: 2026-01-22 22:29:44.687099315 +0000 UTC m=+0.049590030 container died 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:29:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8-userdata-shm.mount: Deactivated successfully.
Jan 22 22:29:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-9a840841dc747cd8f385912d50251127686d26eaccdd4eca3b3670eed95f11a3-merged.mount: Deactivated successfully.
Jan 22 22:29:44 compute-0 podman[221111]: 2026-01-22 22:29:44.727965103 +0000 UTC m=+0.090455788 container cleanup 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 22:29:44 compute-0 systemd[1]: libpod-conmon-5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8.scope: Deactivated successfully.
Jan 22 22:29:44 compute-0 podman[221142]: 2026-01-22 22:29:44.797426327 +0000 UTC m=+0.046679656 container remove 5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.803 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fcac41-44e6-4320-a203-642f026bb27b]: (4, ('Thu Jan 22 10:29:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8 (5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8)\n5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8\nThu Jan 22 10:29:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8 (5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8)\n5a5b897fc9c81ffb33c08073ede1a7d802b0eafe79bb591fc27ef56ea8b484e8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.805 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b71beb62-7988-478a-8029-66e6a676231f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.805 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6383b5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 kernel: tap3c6383b5-40: left promiscuous mode
Jan 22 22:29:44 compute-0 nova_compute[182725]: 2026-01-22 22:29:44.832 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.838 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2ba544-f467-4b68-86ee-e35878a9148d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.855 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[387bc5d0-8f13-43aa-9d12-c41ff0ca0c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.856 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[428dbde2-f75c-4b2e-a1be-e1cdf5fd695b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.877 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[93f85f62-9330-43d7-ab41-2246e7a67728]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461616, 'reachable_time': 42472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221157, 'error': None, 'target': 'ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.879 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c6383b5-4f25-45b9-ae16-88694b1c61a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:29:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:44.879 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[33a31532-0556-4cde-ad66-5f44f6eb7821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:29:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d3c6383b5\x2d4f25\x2d45b9\x2dae16\x2d88694b1c61a8.mount: Deactivated successfully.
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.157 182729 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.158 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.158 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.158 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.158 182729 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-unplugged-9f0fc2de-e685-4869-b02e-2755f938de79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-9f0fc2de-e685-4869-b02e-2755f938de79 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.159 182729 WARNING nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-9f0fc2de-e685-4869-b02e-2755f938de79 for instance with vm_state active and task_state deleting.
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.847 182729 DEBUG nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.848 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.848 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.848 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.848 182729 DEBUG nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-unplugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.848 182729 DEBUG nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-unplugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 DEBUG nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 DEBUG oslo_concurrency.lockutils [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 DEBUG nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] No waiting events found dispatching network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:29:46 compute-0 nova_compute[182725]: 2026-01-22 22:29:46.849 182729 WARNING nova.compute.manager [req-d7185ed7-92f3-4050-a7ae-812abd57baa6 req-0b87dab1-b1c2-463b-9478-c950d39be26f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received unexpected event network-vif-plugged-bb779c0b-7bb9-4554-bb17-a824e418cd3e for instance with vm_state active and task_state deleting.
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.339 182729 DEBUG nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-deleted-1785457a-7bc5-41e8-9339-52af1358fb85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.340 182729 INFO nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Neutron deleted interface 1785457a-7bc5-41e8-9339-52af1358fb85; detaching it from the instance and deleting it from the info cache
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.340 182729 DEBUG nova.network.neutron [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "address": "fa:16:3e:7c:3e:28", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb779c0b-7b", "ovs_interfaceid": "bb779c0b-7bb9-4554-bb17-a824e418cd3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.381 182729 DEBUG nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Detach interface failed, port_id=1785457a-7bc5-41e8-9339-52af1358fb85, reason: Instance 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.382 182729 DEBUG nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-deleted-bb779c0b-7bb9-4554-bb17-a824e418cd3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.382 182729 INFO nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Neutron deleted interface bb779c0b-7bb9-4554-bb17-a824e418cd3e; detaching it from the instance and deleting it from the info cache
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.382 182729 DEBUG nova.network.neutron [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [{"id": "9f0fc2de-e685-4869-b02e-2755f938de79", "address": "fa:16:3e:ac:ae:3a", "network": {"id": "3c6383b5-4f25-45b9-ae16-88694b1c61a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-940108843", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6721d3fe421f42c6a38a2d2e9378217a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f0fc2de-e6", "ovs_interfaceid": "9f0fc2de-e685-4869-b02e-2755f938de79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.405 182729 DEBUG nova.compute.manager [req-423323a6-78d4-46a8-9fc9-becefd1cda94 req-17909caa-6302-43bd-80a1-09601948a519 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Detach interface failed, port_id=bb779c0b-7bb9-4554-bb17-a824e418cd3e, reason: Instance 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.415 182729 DEBUG nova.network.neutron [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.432 182729 INFO nova.compute.manager [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Took 4.13 seconds to deallocate network for instance.
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.505 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.506 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.569 182729 DEBUG nova.compute.provider_tree [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.584 182729 DEBUG nova.scheduler.client.report [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.612 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.635 182729 INFO nova.scheduler.client.report [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Deleted allocations for instance 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.722 182729 DEBUG oslo_concurrency.lockutils [None req-83b7d78c-4879-4332-9a23-94516f43a244 3d68435ec92a4e0e900bbd275c277a15 6721d3fe421f42c6a38a2d2e9378217a - - default default] Lock "87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:29:48 compute-0 nova_compute[182725]: 2026-01-22 22:29:48.874 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:49 compute-0 nova_compute[182725]: 2026-01-22 22:29:49.208 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:50 compute-0 nova_compute[182725]: 2026-01-22 22:29:50.462 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:50.462 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:29:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:50.464 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:29:50 compute-0 nova_compute[182725]: 2026-01-22 22:29:50.525 182729 DEBUG nova.compute.manager [req-c8522bea-bbdb-4815-ba3b-e014c75cd448 req-6c5c10ad-1022-4b7f-a8b9-c397cf0e0555 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Received event network-vif-deleted-9f0fc2de-e685-4869-b02e-2755f938de79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:29:51 compute-0 ovn_controller[94850]: 2026-01-22T22:29:51Z|00237|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:29:51 compute-0 nova_compute[182725]: 2026-01-22 22:29:51.145 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:53 compute-0 nova_compute[182725]: 2026-01-22 22:29:53.876 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:54 compute-0 nova_compute[182725]: 2026-01-22 22:29:54.210 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:55 compute-0 podman[221159]: 2026-01-22 22:29:55.13787583 +0000 UTC m=+0.060359334 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:29:55 compute-0 podman[221158]: 2026-01-22 22:29:55.148261473 +0000 UTC m=+0.066425108 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:29:55 compute-0 ovn_controller[94850]: 2026-01-22T22:29:55Z|00238|binding|INFO|Releasing lport 57be3db3-de80-45e5-a479-c3b4b4920475 from this chassis (sb_readonly=0)
Jan 22 22:29:55 compute-0 nova_compute[182725]: 2026-01-22 22:29:55.740 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:29:57.467 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:29:58 compute-0 nova_compute[182725]: 2026-01-22 22:29:58.878 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:29:59 compute-0 nova_compute[182725]: 2026-01-22 22:29:59.139 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120984.1382277, 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:29:59 compute-0 nova_compute[182725]: 2026-01-22 22:29:59.139 182729 INFO nova.compute.manager [-] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] VM Stopped (Lifecycle Event)
Jan 22 22:29:59 compute-0 podman[221214]: 2026-01-22 22:29:59.147674445 +0000 UTC m=+0.077974641 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:29:59 compute-0 nova_compute[182725]: 2026-01-22 22:29:59.161 182729 DEBUG nova.compute.manager [None req-27c90ded-6d21-4073-9595-d5899a612d28 - - - - - -] [instance: 87ba655a-78f5-4a2b-a863-0a2cf6ca8dd6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:29:59 compute-0 nova_compute[182725]: 2026-01-22 22:29:59.212 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:03 compute-0 nova_compute[182725]: 2026-01-22 22:30:03.880 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:04 compute-0 nova_compute[182725]: 2026-01-22 22:30:04.214 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.644 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.713 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.714 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.743 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.848 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.848 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.855 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:30:07 compute-0 nova_compute[182725]: 2026-01-22 22:30:07.855 182729 INFO nova.compute.claims [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.024 182729 DEBUG nova.compute.provider_tree [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.038 182729 DEBUG nova.scheduler.client.report [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.063 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.063 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.162 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.163 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.190 182729 INFO nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.207 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.331 182729 DEBUG nova.policy [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.336 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.337 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.338 182729 INFO nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Creating image(s)
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.338 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.339 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.340 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.355 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.419 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.421 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.422 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.438 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.501 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.502 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.553 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.555 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.555 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.614 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.615 182729 DEBUG nova.virt.disk.api [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Checking if we can resize image /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.616 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.676 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.678 182729 DEBUG nova.virt.disk.api [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Cannot resize image /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.678 182729 DEBUG nova.objects.instance [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'migration_context' on Instance uuid edb59ec0-c6f0-4757-b5cc-293686870779 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.741 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.741 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Ensure instance console log exists: /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.742 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.742 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.742 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:08 compute-0 nova_compute[182725]: 2026-01-22 22:30:08.882 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:09 compute-0 nova_compute[182725]: 2026-01-22 22:30:09.005 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Successfully created port: 74ec03d6-0608-4d5c-9426-b7bf6c291c36 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.111 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'name': 'tempest-ServerActionsTestOtherA-server-75282839', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '825c15e60ddd4efeb69accacdb4b129b', 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'hostId': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.113 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>]
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.132 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.133 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7959cafe-d0c9-4f6c-8968-d530b4d74ef8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.114066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8a5380e-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': 'c1747c0d49677b7646b9665621ef84ca8997492919095caebeee9ac2f74d5d22'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.114066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8a54a56-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': '7dab2b2e51846d63b30741d8f568939170f0b55e3c4180ca653e98c97a5105ad'}]}, 'timestamp': '2026-01-22 22:30:09.133588', '_unique_id': 'f4bcf34ca68f47a98abed2a2e701e8ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.135 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>]
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.137 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.157 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/cpu volume: 11970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88fde47a-ae1e-46e9-a947-da6c4da3e95e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11970000000, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'timestamp': '2026-01-22T22:30:09.137660', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e8a8fa16-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.814218847, 'message_signature': 'a992d1345b569be30b40b7df715142404fa6059b30426fd68960996195532301'}]}, 'timestamp': '2026-01-22 22:30:09.158017', '_unique_id': '6909a0c351904ef189cab238de9779ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.159 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.162 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.163 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b43d4582-ff78-46ad-b30c-344049349f4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.162685', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8a9d990-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': '418b6e050e2b0c6bc61e92d013fbf823a9820b8355a36a60e0c0e32b1bd5df4c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.162685', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8a9f2fe-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': '65713f000351cb3122b2f35d90cf2c44d48dd54880d5232e0f51c4022739b166'}]}, 'timestamp': '2026-01-22 22:30:09.164256', '_unique_id': 'b6035e133f77480e8ccbf208293cba68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.166 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.167 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.168 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>]
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.172 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fcd5dfc9-aa45-42d6-96d8-739f7eb5504a / tap44997b83-45 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.172 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f520298-d283-4299-be2f-8e0d7ca55783', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.168560', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8ab4992-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': 'a791f3d4cf56349c339af0c002953605af2e1e4ddf34c2ebc8ab9ddb44c88bef'}]}, 'timestamp': '2026-01-22 22:30:09.173020', '_unique_id': '69e18a973e6345f0ba3d15c4fd481ce2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.174 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.175 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.175 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-75282839>]
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.176 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.176 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '979b5c4e-97c3-4980-b1df-590283dfc0ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.176249', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8abdc90-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': '6da92ecd6d55c8638432b7e1f669447b6a516ae3839505f6a31dab93603185ca'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 
'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.176249', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8abf00e-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.771330347, 'message_signature': '92c5496c30f952c1262a118a031c729857926d5963abcb5b50261966100c4de4'}]}, 'timestamp': '2026-01-22 22:30:09.177196', '_unique_id': '61900bc845e24fe08baedfd4795fd7cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.180 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12ec9557-0fb0-4b6c-9b77-e09211a95cdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.179950', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8ac711e-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '35f9daae7ecd180c8c6c9fefc827b2764b1a9f686c014e926c881f3b7a21ffb5'}]}, 'timestamp': '2026-01-22 22:30:09.180621', '_unique_id': '91eaae73337444159467a5a1b4ba69d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.182 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.183 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.183 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/memory.usage volume: 42.8984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5af3419-7a10-43a7-9f35-86e756b62cbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8984375, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'timestamp': '2026-01-22T22:30:09.183454', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e8acf83c-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.814218847, 'message_signature': '1ee4462b46bbd18d7f6043f6d19a0edf3a611a574fd3bf282990268f1ded5cd6'}]}, 'timestamp': '2026-01-22 22:30:09.184006', '_unique_id': 'c8c45d412b314be29149b93247ed70af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.186 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab8e257a-efb2-4b47-b479-fcf57b5af755', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.186605', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8ad758c-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '8e9006bc259ff6e99369ba0a158659c729ebd31387b7b65e6f395e70633d3be0'}]}, 'timestamp': '2026-01-22 22:30:09.187199', '_unique_id': '55b3fb24a60e44ef9814ab1006d8e5a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.188 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.189 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4de7bc01-847a-4e66-989d-f91fdd7f32ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.189575', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8ade774-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '210d5923fbe0f79a2b43ff2c227387c66236de9870556633381c8e1b748b0cd1'}]}, 'timestamp': '2026-01-22 22:30:09.190110', '_unique_id': '23f4e4078ce64077b2a5d79b300d45c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.191 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:30:09 compute-0 nova_compute[182725]: 2026-01-22 22:30:09.218 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.241 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.latency volume: 12257412543 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.242 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12128421-bc63-42d0-a4a3-87ff6f062410', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12257412543, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.192432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8b5e3b6-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': 'ed8a642eb6b2d82ad0924acee2ae9f7ea3b9015f9adf8e6aa1d596f310bf2a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.192432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8b5fb62-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': 'dff588601a1c3c74e7c2c9ee3af455d8d2eaea1eb38117afdafee41b6b5d31c9'}]}, 'timestamp': '2026-01-22 22:30:09.243078', '_unique_id': '8c6169f55dad4f11a2c6e259f9bccb4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.244 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.246 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6b0517a-de79-4484-88d1-77ba619630d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.246719', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8b6a094-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '65b4dd8a76afb164f8a3f81381194ba6a162d93fc66693af2d2e07a361d62481'}]}, 'timestamp': '2026-01-22 22:30:09.247290', '_unique_id': '94b37cdb6fc346eab5f6286a3cc855d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.249 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.bytes volume: 30185984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.250 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '984a1fb9-f5f1-451c-9cdd-b61bbbea50bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30185984, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.249704', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8b7140c-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '900cf74ce1d1d3561feed758fcc157ba79684e6cf1debb917080645fc6afc685'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.249704', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8b72726-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '851390db81c43e63c406a6f0f3c4bdd9aa85e265b4c55e987441d542093cb731'}]}, 'timestamp': '2026-01-22 22:30:09.250698', '_unique_id': 'efc5fbfd1460469187769f4369b58d41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.251 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.253 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.bytes volume: 73068544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.253 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b75a8f56-a72f-4805-bd85-e0ad930b5ce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73068544, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.253116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8b797a6-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '6d30e137164d5d9a7509ea91152d35de6b5dc50d01732bc0436d683109e77f6a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.253116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8b7a9bc-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': 'e8895208f4b09bec498b36d7aaca64b2355f58738ce716d9ece4817165744520'}]}, 'timestamp': '2026-01-22 22:30:09.254039', '_unique_id': '91cca389bf644b759d6848d67a61a33e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.255 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.256 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9aa1888-6a51-4d78-baa7-89af891962ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.256455', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8b81a64-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '4a0de953755397484298d2c97692bcb08cb2094f843f4a4499d6dd5f49dc8a10'}]}, 'timestamp': '2026-01-22 22:30:09.256996', '_unique_id': 'f583dd4bed6d4d2ba99f5a2b0e28f1f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.259 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.260 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fca11945-a1a0-47f3-8478-a521290d7cb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.259343', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8b88ada-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': 'ecec0d136955e2cbcbf13d12d25d530765ff2dacd5b95107643b0414286eb6ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.259343', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8b8acd6-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '458b7d77d0aace3f9af82267faef7de2aba75c57da280caa323485a90899fe53'}]}, 'timestamp': '2026-01-22 22:30:09.260763', '_unique_id': '3806e498bbef457f93da2e9f86c06af8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.263 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b1e8647-7269-4b41-95b8-aa361388d4db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.263298', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8b925c6-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '552cdd22643eb0044857057c431d03659332c2be63a8d004cfa04e87cfeefeae'}]}, 'timestamp': '2026-01-22 22:30:09.263836', '_unique_id': 'd25cbf53d2674bdb850743b4575e67a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.266 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.266 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd705ddea-ee50-44fd-be96-e193a44c246a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.266213', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8b99786-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '61e82816cd851c604b84e5179f660bba4922d59e7d21d092d0e7066019522a43'}]}, 'timestamp': '2026-01-22 22:30:09.266710', '_unique_id': '235aa469615d4aacb515a20ef60ab567'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.267 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.268 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.269 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.latency volume: 349312336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.269 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.read.latency volume: 37067570 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4f64ff4-a473-4b89-8339-d72eaf5a2071', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 349312336, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.269114', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8ba089c-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '273a1434abe90764bf1905aa8408846943f2d9ad0c01b1950d2d69d2d0092e6a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37067570, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.269114', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8ba1ada-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '4b40da8927558cf327cad78532d64b2a986ad70b96a0c7dc6f7d6461aa14889b'}]}, 'timestamp': '2026-01-22 22:30:09.270051', '_unique_id': '51b04ef56adb4cf9b04180e916f00637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.272 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.272 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1c63244-9c43-4788-8c54-d50b65cba2cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.272571', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8ba9334-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': '8ab3e7bff9fc8cd1092af0896ca75614381556fc70826ef4e6184e3a005e26d5'}]}, 'timestamp': '2026-01-22 22:30:09.273162', '_unique_id': '2ebf0864f93842a2848585dc829862ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.274 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.275 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.276 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47c0f24-df33-4c5e-9c02-97e6c9da0e08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-vda', 'timestamp': '2026-01-22T22:30:09.275847', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e8bb100c-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '50c27d3ae57d2b86a3d2bdd0dcdc4e47e407cff93c19fdbb038f62a2cc8e5c4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-sda', 'timestamp': '2026-01-22T22:30:09.275847', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'instance-00000048', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e8bb24f2-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.849753219, 'message_signature': '457305f4b52b0f70fd75b0392bd6741ad3b549d723dd5ec249cd61d5897caee9'}]}, 'timestamp': '2026-01-22 22:30:09.276971', '_unique_id': '797ab03774ca47a78106ee28840b3ac2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.279 12 DEBUG ceilometer.compute.pollsters [-] fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b157815-5a12-41aa-9ca9-618a6c95b035', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fc3e5f9d1ee84e48a089c2636d28a7b0', 'user_name': None, 'project_id': '825c15e60ddd4efeb69accacdb4b129b', 'project_name': None, 'resource_id': 'instance-00000048-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-tap44997b83-45', 'timestamp': '2026-01-22T22:30:09.279314', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-75282839', 'name': 'tap44997b83-45', 'instance_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'instance_type': 'm1.nano', 'host': '7ae863fa1353b29f2ef1f9c7c005c228d1a1730b9af91a34bc756c2c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:7f:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44997b83-45'}, 'message_id': 'e8bb94c8-f7e1-11f0-9a35-fa163e3d8874', 'monotonic_time': 4646.825909774, 'message_signature': 'ff3943e7e1af453e8711f8cbae09b9cafec4dfd41439f2ca465b057071196ec4'}]}, 'timestamp': '2026-01-22 22:30:09.279664', '_unique_id': 'be5def1da1b44c1a8ce6dddd909fd055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:30:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:30:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.340 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Successfully updated port: 74ec03d6-0608-4d5c-9426-b7bf6c291c36 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.357 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.358 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquired lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.358 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.489 182729 DEBUG nova.compute.manager [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-changed-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.490 182729 DEBUG nova.compute.manager [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Refreshing instance network info cache due to event network-changed-74ec03d6-0608-4d5c-9426-b7bf6c291c36. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.490 182729 DEBUG oslo_concurrency.lockutils [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:10 compute-0 nova_compute[182725]: 2026-01-22 22:30:10.569 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:30:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:12.437 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:12.438 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:12.438 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.487 182729 DEBUG nova.network.neutron [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updating instance_info_cache with network_info: [{"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.512 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Releasing lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.512 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Instance network_info: |[{"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.513 182729 DEBUG oslo_concurrency.lockutils [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.513 182729 DEBUG nova.network.neutron [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Refreshing network info cache for port 74ec03d6-0608-4d5c-9426-b7bf6c291c36 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.516 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Start _get_guest_xml network_info=[{"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.523 182729 WARNING nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.531 182729 DEBUG nova.virt.libvirt.host [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.532 182729 DEBUG nova.virt.libvirt.host [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.535 182729 DEBUG nova.virt.libvirt.host [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.536 182729 DEBUG nova.virt.libvirt.host [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.537 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.538 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.538 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.538 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.538 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.538 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.539 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.539 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.539 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.539 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.539 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.540 182729 DEBUG nova.virt.hardware [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.543 182729 DEBUG nova.virt.libvirt.vif [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-141992064',display_name='tempest-ServerActionsTestOtherA-server-141992064',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-141992064',id=80,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-ukr27gl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerActionsTestOtherA-658780637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:08Z,user_data=None,user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=edb59ec0-c6f0-4757-b5cc-293686870779,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.543 182729 DEBUG nova.network.os_vif_util [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.544 182729 DEBUG nova.network.os_vif_util [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.544 182729 DEBUG nova.objects.instance [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'pci_devices' on Instance uuid edb59ec0-c6f0-4757-b5cc-293686870779 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.558 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <uuid>edb59ec0-c6f0-4757-b5cc-293686870779</uuid>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <name>instance-00000050</name>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestOtherA-server-141992064</nova:name>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:30:12</nova:creationTime>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:user uuid="fc3e5f9d1ee84e48a089c2636d28a7b0">tempest-ServerActionsTestOtherA-658780637-project-member</nova:user>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:project uuid="825c15e60ddd4efeb69accacdb4b129b">tempest-ServerActionsTestOtherA-658780637</nova:project>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         <nova:port uuid="74ec03d6-0608-4d5c-9426-b7bf6c291c36">
Jan 22 22:30:12 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <system>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="serial">edb59ec0-c6f0-4757-b5cc-293686870779</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="uuid">edb59ec0-c6f0-4757-b5cc-293686870779</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </system>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <os>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </os>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <features>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </features>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.config"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:76:8a:22"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <target dev="tap74ec03d6-06"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/console.log" append="off"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <video>
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </video>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:30:12 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:30:12 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:30:12 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:30:12 compute-0 nova_compute[182725]: </domain>
Jan 22 22:30:12 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.559 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Preparing to wait for external event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.559 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.560 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.560 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.560 182729 DEBUG nova.virt.libvirt.vif [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-141992064',display_name='tempest-ServerActionsTestOtherA-server-141992064',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-141992064',id=80,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-ukr27gl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerAc
tionsTestOtherA-658780637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:08Z,user_data=None,user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=edb59ec0-c6f0-4757-b5cc-293686870779,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.561 182729 DEBUG nova.network.os_vif_util [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.561 182729 DEBUG nova.network.os_vif_util [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.562 182729 DEBUG os_vif [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.562 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.563 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.563 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.568 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.568 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ec03d6-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.569 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74ec03d6-06, col_values=(('external_ids', {'iface-id': '74ec03d6-0608-4d5c-9426-b7bf6c291c36', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:8a:22', 'vm-uuid': 'edb59ec0-c6f0-4757-b5cc-293686870779'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.572 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:30:12 compute-0 NetworkManager[54954]: <info>  [1769121012.5724] manager: (tap74ec03d6-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.580 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.581 182729 INFO os_vif [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06')
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.663 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.663 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.663 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] No VIF found with MAC fa:16:3e:76:8a:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.664 182729 INFO nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Using config drive
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.842 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.865 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Triggering sync for uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.866 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Triggering sync for uuid edb59ec0-c6f0-4757-b5cc-293686870779 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.867 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.867 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.868 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:12 compute-0 nova_compute[182725]: 2026-01-22 22:30:12.897 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.202 182729 INFO nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Creating config drive at /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.config
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.211 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3phypko2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.357 182729 DEBUG oslo_concurrency.processutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3phypko2" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:13 compute-0 kernel: tap74ec03d6-06: entered promiscuous mode
Jan 22 22:30:13 compute-0 NetworkManager[54954]: <info>  [1769121013.4617] manager: (tap74ec03d6-06): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Jan 22 22:30:13 compute-0 ovn_controller[94850]: 2026-01-22T22:30:13Z|00239|binding|INFO|Claiming lport 74ec03d6-0608-4d5c-9426-b7bf6c291c36 for this chassis.
Jan 22 22:30:13 compute-0 ovn_controller[94850]: 2026-01-22T22:30:13Z|00240|binding|INFO|74ec03d6-0608-4d5c-9426-b7bf6c291c36: Claiming fa:16:3e:76:8a:22 10.100.0.5
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.463 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.474 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:8a:22 10.100.0.5'], port_security=['fa:16:3e:76:8a:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'edb59ec0-c6f0-4757-b5cc-293686870779', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a544701e-2e05-4802-ba07-c012963707f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '825c15e60ddd4efeb69accacdb4b129b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '76964364-92f8-48bf-9763-4bfae711ee59', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70afdb24-37c1-41f2-9284-84cfdd4b7137, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=74ec03d6-0608-4d5c-9426-b7bf6c291c36) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.476 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 74ec03d6-0608-4d5c-9426-b7bf6c291c36 in datapath a544701e-2e05-4802-ba07-c012963707f2 bound to our chassis
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.482 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a544701e-2e05-4802-ba07-c012963707f2
Jan 22 22:30:13 compute-0 ovn_controller[94850]: 2026-01-22T22:30:13Z|00241|binding|INFO|Setting lport 74ec03d6-0608-4d5c-9426-b7bf6c291c36 ovn-installed in OVS
Jan 22 22:30:13 compute-0 ovn_controller[94850]: 2026-01-22T22:30:13Z|00242|binding|INFO|Setting lport 74ec03d6-0608-4d5c-9426-b7bf6c291c36 up in Southbound
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.504 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cac9e85b-9a94-423f-8eeb-17af656b3081]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 systemd-udevd[221294]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:30:13 compute-0 systemd-machined[154006]: New machine qemu-31-instance-00000050.
Jan 22 22:30:13 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-00000050.
Jan 22 22:30:13 compute-0 NetworkManager[54954]: <info>  [1769121013.5503] device (tap74ec03d6-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:30:13 compute-0 NetworkManager[54954]: <info>  [1769121013.5509] device (tap74ec03d6-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.553 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aada74be-83d2-4f59-a6ef-ae5d966c6bc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 podman[221268]: 2026-01-22 22:30:13.556245654 +0000 UTC m=+0.107272596 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.558 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b64df71e-3bb9-4d2f-b6bd-f7e1608ddffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.597 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2b87706f-6e95-4c00-82dc-d63f9c7e1ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.623 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a43457-1031-4394-835d-bdd6bbac934a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa544701e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:37:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457642, 'reachable_time': 34940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221306, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.648 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9f791514-a7d2-4ea8-b095-1676ae6d8bf6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa544701e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457661, 'tstamp': 457661}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221309, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa544701e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457665, 'tstamp': 457665}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221309, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.650 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa544701e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.652 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.654 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.654 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa544701e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.654 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.654 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa544701e-20, col_values=(('external_ids', {'iface-id': '57be3db3-de80-45e5-a479-c3b4b4920475'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:13.655 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.885 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:13 compute-0 nova_compute[182725]: 2026-01-22 22:30:13.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.056 182729 DEBUG nova.network.neutron [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updated VIF entry in instance network info cache for port 74ec03d6-0608-4d5c-9426-b7bf6c291c36. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.056 182729 DEBUG nova.network.neutron [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updating instance_info_cache with network_info: [{"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.073 182729 DEBUG oslo_concurrency.lockutils [req-6b5a3c84-0393-4c6e-8e7a-752546f28a78 req-f29e9af7-0bbe-444d-a7d7-17548bfa5a45 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.225 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121014.2244902, edb59ec0-c6f0-4757-b5cc-293686870779 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.225 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] VM Started (Lifecycle Event)
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.258 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.265 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121014.2293339, edb59ec0-c6f0-4757-b5cc-293686870779 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.265 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] VM Paused (Lifecycle Event)
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.285 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.289 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.306 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.862 182729 DEBUG nova.compute.manager [req-6c81c0fa-0d0e-441b-9598-d4a2eb082e01 req-ee667db3-c648-48a2-8f14-29ebe4c0c42c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.865 182729 DEBUG oslo_concurrency.lockutils [req-6c81c0fa-0d0e-441b-9598-d4a2eb082e01 req-ee667db3-c648-48a2-8f14-29ebe4c0c42c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.866 182729 DEBUG oslo_concurrency.lockutils [req-6c81c0fa-0d0e-441b-9598-d4a2eb082e01 req-ee667db3-c648-48a2-8f14-29ebe4c0c42c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.866 182729 DEBUG oslo_concurrency.lockutils [req-6c81c0fa-0d0e-441b-9598-d4a2eb082e01 req-ee667db3-c648-48a2-8f14-29ebe4c0c42c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.867 182729 DEBUG nova.compute.manager [req-6c81c0fa-0d0e-441b-9598-d4a2eb082e01 req-ee667db3-c648-48a2-8f14-29ebe4c0c42c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Processing event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.868 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.873 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121014.8728786, edb59ec0-c6f0-4757-b5cc-293686870779 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.873 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] VM Resumed (Lifecycle Event)
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.877 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.881 182729 INFO nova.virt.libvirt.driver [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Instance spawned successfully.
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.882 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.900 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.912 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.920 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.921 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.922 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.923 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.923 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.924 182729 DEBUG nova.virt.libvirt.driver [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.937 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.938 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.938 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.938 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:30:14 compute-0 nova_compute[182725]: 2026-01-22 22:30:14.952 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.005 182729 INFO nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Took 6.67 seconds to spawn the instance on the hypervisor.
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.006 182729 DEBUG nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.049 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:15 compute-0 podman[221320]: 2026-01-22 22:30:15.096965043 +0000 UTC m=+0.092059529 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.105 182729 INFO nova.compute.manager [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Took 7.30 seconds to build instance.
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.120 182729 DEBUG oslo_concurrency.lockutils [None req-26249ca4-0a93-4ceb-8435-add55699e814 fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.121 182729 INFO nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.139 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.141 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:15 compute-0 podman[221319]: 2026-01-22 22:30:15.142828168 +0000 UTC m=+0.135443641 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.224 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.232 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.294 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.295 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.359 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.606 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.608 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5514MB free_disk=73.34689712524414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.608 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.608 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.774 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance fcd5dfc9-aa45-42d6-96d8-739f7eb5504a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.774 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance edb59ec0-c6f0-4757-b5cc-293686870779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.774 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.774 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.855 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.869 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.891 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:30:15 compute-0 nova_compute[182725]: 2026-01-22 22:30:15.892 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.982 182729 DEBUG nova.compute.manager [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.983 182729 DEBUG oslo_concurrency.lockutils [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.983 182729 DEBUG oslo_concurrency.lockutils [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.984 182729 DEBUG oslo_concurrency.lockutils [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.984 182729 DEBUG nova.compute.manager [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] No waiting events found dispatching network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:16 compute-0 nova_compute[182725]: 2026-01-22 22:30:16.984 182729 WARNING nova.compute.manager [req-b33521b4-94ab-44cc-add1-c66af42477db req-e032706b-b6e6-409f-a9ce-39f227492340 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received unexpected event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 for instance with vm_state active and task_state None.
Jan 22 22:30:17 compute-0 nova_compute[182725]: 2026-01-22 22:30:17.572 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.572 182729 DEBUG nova.compute.manager [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-changed-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.572 182729 DEBUG nova.compute.manager [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Refreshing instance network info cache due to event network-changed-74ec03d6-0608-4d5c-9426-b7bf6c291c36. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.573 182729 DEBUG oslo_concurrency.lockutils [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.574 182729 DEBUG oslo_concurrency.lockutils [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.574 182729 DEBUG nova.network.neutron [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Refreshing network info cache for port 74ec03d6-0608-4d5c-9426-b7bf6c291c36 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.889 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.893 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:30:18 compute-0 nova_compute[182725]: 2026-01-22 22:30:18.893 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:30:19 compute-0 nova_compute[182725]: 2026-01-22 22:30:19.349 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:19 compute-0 nova_compute[182725]: 2026-01-22 22:30:19.349 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:19 compute-0 nova_compute[182725]: 2026-01-22 22:30:19.350 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:30:19 compute-0 nova_compute[182725]: 2026-01-22 22:30:19.350 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.273 182729 DEBUG nova.network.neutron [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updated VIF entry in instance network info cache for port 74ec03d6-0608-4d5c-9426-b7bf6c291c36. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.274 182729 DEBUG nova.network.neutron [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updating instance_info_cache with network_info: [{"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.301 182729 DEBUG oslo_concurrency.lockutils [req-9863d2ff-e02a-4c61-b071-c1f3af8aebe1 req-d63bb011-e691-4cba-aa99-32474f46c917 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-edb59ec0-c6f0-4757-b5cc-293686870779" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.842 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.843 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:20 compute-0 nova_compute[182725]: 2026-01-22 22:30:20.865 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.013 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.014 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.028 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.029 182729 INFO nova.compute.claims [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.071 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [{"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.106 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.106 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.107 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.107 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.240 182729 DEBUG nova.compute.provider_tree [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.254 182729 DEBUG nova.scheduler.client.report [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.274 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.275 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.351 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.352 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.409 182729 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.433 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.649 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.650 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.651 182729 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Creating image(s)
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.651 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.652 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.653 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.666 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.766 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.768 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.769 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.786 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.860 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.862 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.896 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.898 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.923 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.925 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:21 compute-0 nova_compute[182725]: 2026-01-22 22:30:21.926 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.014 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.016 182729 DEBUG nova.virt.disk.api [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Checking if we can resize image /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.017 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.077 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.079 182729 DEBUG nova.virt.disk.api [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Cannot resize image /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.080 182729 DEBUG nova.objects.instance [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'migration_context' on Instance uuid 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.151 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.153 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Ensure instance console log exists: /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.154 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.154 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.155 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.318 182729 DEBUG nova.policy [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '24157ae704064825a4f59adf1d187391', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:22 compute-0 nova_compute[182725]: 2026-01-22 22:30:22.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:23 compute-0 nova_compute[182725]: 2026-01-22 22:30:23.892 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.829 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.830 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.831 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.832 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.832 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.853 182729 INFO nova.compute.manager [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Terminating instance
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.868 182729 DEBUG nova.compute.manager [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:30:24 compute-0 kernel: tap74ec03d6-06 (unregistering): left promiscuous mode
Jan 22 22:30:24 compute-0 NetworkManager[54954]: <info>  [1769121024.8963] device (tap74ec03d6-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:30:24 compute-0 ovn_controller[94850]: 2026-01-22T22:30:24Z|00243|binding|INFO|Releasing lport 74ec03d6-0608-4d5c-9426-b7bf6c291c36 from this chassis (sb_readonly=0)
Jan 22 22:30:24 compute-0 ovn_controller[94850]: 2026-01-22T22:30:24Z|00244|binding|INFO|Setting lport 74ec03d6-0608-4d5c-9426-b7bf6c291c36 down in Southbound
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.917 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:24 compute-0 ovn_controller[94850]: 2026-01-22T22:30:24Z|00245|binding|INFO|Removing iface tap74ec03d6-06 ovn-installed in OVS
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.921 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:24.929 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:8a:22 10.100.0.5'], port_security=['fa:16:3e:76:8a:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'edb59ec0-c6f0-4757-b5cc-293686870779', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a544701e-2e05-4802-ba07-c012963707f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '825c15e60ddd4efeb69accacdb4b129b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70afdb24-37c1-41f2-9284-84cfdd4b7137, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=74ec03d6-0608-4d5c-9426-b7bf6c291c36) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:24.932 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 74ec03d6-0608-4d5c-9426-b7bf6c291c36 in datapath a544701e-2e05-4802-ba07-c012963707f2 unbound from our chassis
Jan 22 22:30:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:24.935 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a544701e-2e05-4802-ba07-c012963707f2
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.943 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:24.962 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab55384-c7b2-4b51-8008-f57b338575e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:24 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 22 22:30:24 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000050.scope: Consumed 10.882s CPU time.
Jan 22 22:30:24 compute-0 systemd-machined[154006]: Machine qemu-31-instance-00000050 terminated.
Jan 22 22:30:24 compute-0 nova_compute[182725]: 2026-01-22 22:30:24.996 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Successfully created port: 4ba37fa6-0119-454f-8cc7-5ac2a143374a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:30:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:24.998 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[39a3e524-74cf-48ff-951e-1935cb24831a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.003 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[46337970-0413-401c-baa5-07611c7c8dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.035 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6a3381-056f-4f78-9a9d-b7829b72f628]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.058 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bfde9d-f1c9-4bec-9312-7045715e8de1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa544701e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:37:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457642, 'reachable_time': 34940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221403, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.085 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9b7cb0-5d25-4d72-80d4-a8f685f6e84e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa544701e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457661, 'tstamp': 457661}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221404, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa544701e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457665, 'tstamp': 457665}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221404, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.090 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa544701e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.093 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.098 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.099 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa544701e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.100 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.101 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa544701e-20, col_values=(('external_ids', {'iface-id': '57be3db3-de80-45e5-a479-c3b4b4920475'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:25.101 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.175 182729 INFO nova.virt.libvirt.driver [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Instance destroyed successfully.
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.176 182729 DEBUG nova.objects.instance [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'resources' on Instance uuid edb59ec0-c6f0-4757-b5cc-293686870779 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.193 182729 DEBUG nova.virt.libvirt.vif [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-141992064',display_name='tempest-ServerActionsTestOtherA-server-141992064',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-141992064',id=80,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-ukr27gl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerActionsTestOtherA-658780637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:15Z,user_data=None,user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=edb59ec0-c6f0-4757-b5cc-293686870779,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.194 182729 DEBUG nova.network.os_vif_util [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "address": "fa:16:3e:76:8a:22", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74ec03d6-06", "ovs_interfaceid": "74ec03d6-0608-4d5c-9426-b7bf6c291c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.195 182729 DEBUG nova.network.os_vif_util [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.196 182729 DEBUG os_vif [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.198 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ec03d6-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.201 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.204 182729 INFO os_vif [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:8a:22,bridge_name='br-int',has_traffic_filtering=True,id=74ec03d6-0608-4d5c-9426-b7bf6c291c36,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74ec03d6-06')
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.205 182729 INFO nova.virt.libvirt.driver [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Deleting instance files /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779_del
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.206 182729 INFO nova.virt.libvirt.driver [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Deletion of /var/lib/nova/instances/edb59ec0-c6f0-4757-b5cc-293686870779_del complete
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.287 182729 INFO nova.compute.manager [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.288 182729 DEBUG oslo.service.loopingcall [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.288 182729 DEBUG nova.compute.manager [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.288 182729 DEBUG nova.network.neutron [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.328 182729 DEBUG nova.compute.manager [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-unplugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.328 182729 DEBUG oslo_concurrency.lockutils [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.328 182729 DEBUG oslo_concurrency.lockutils [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.329 182729 DEBUG oslo_concurrency.lockutils [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.329 182729 DEBUG nova.compute.manager [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] No waiting events found dispatching network-vif-unplugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.329 182729 DEBUG nova.compute.manager [req-cc8f0ebe-1851-4415-8ce5-609829f8e9e4 req-bafe40b2-0109-45e9-82ba-f3f52e405ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-unplugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:30:25 compute-0 nova_compute[182725]: 2026-01-22 22:30:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:30:26 compute-0 podman[221422]: 2026-01-22 22:30:26.171354004 +0000 UTC m=+0.090385006 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 22 22:30:26 compute-0 podman[221423]: 2026-01-22 22:30:26.172630147 +0000 UTC m=+0.089023592 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.237 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Successfully updated port: 4ba37fa6-0119-454f-8cc7-5ac2a143374a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.258 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.258 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquired lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.258 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.361 182729 DEBUG nova.network.neutron [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.382 182729 INFO nova.compute.manager [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Took 1.09 seconds to deallocate network for instance.
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.419 182729 DEBUG nova.compute.manager [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-changed-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.419 182729 DEBUG nova.compute.manager [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Refreshing instance network info cache due to event network-changed-4ba37fa6-0119-454f-8cc7-5ac2a143374a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.419 182729 DEBUG oslo_concurrency.lockutils [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.463 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.463 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.546 182729 DEBUG nova.compute.provider_tree [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.563 182729 DEBUG nova.scheduler.client.report [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.588 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.644 182729 INFO nova.scheduler.client.report [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Deleted allocations for instance edb59ec0-c6f0-4757-b5cc-293686870779
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.668 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:30:26 compute-0 nova_compute[182725]: 2026-01-22 22:30:26.792 182729 DEBUG oslo_concurrency.lockutils [None req-e1058646-d9fe-444b-b0c2-16ff60b83e6c fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.962s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.449 182729 DEBUG nova.compute.manager [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.450 182729 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.450 182729 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.451 182729 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "edb59ec0-c6f0-4757-b5cc-293686870779-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.451 182729 DEBUG nova.compute.manager [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] No waiting events found dispatching network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:27 compute-0 nova_compute[182725]: 2026-01-22 22:30:27.452 182729 WARNING nova.compute.manager [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received unexpected event network-vif-plugged-74ec03d6-0608-4d5c-9426-b7bf6c291c36 for instance with vm_state deleted and task_state None.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.058 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.059 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.059 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.059 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.059 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.076 182729 INFO nova.compute.manager [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Terminating instance
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.087 182729 DEBUG nova.compute.manager [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:30:28 compute-0 kernel: tap44997b83-45 (unregistering): left promiscuous mode
Jan 22 22:30:28 compute-0 NetworkManager[54954]: <info>  [1769121028.1112] device (tap44997b83-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:30:28 compute-0 ovn_controller[94850]: 2026-01-22T22:30:28Z|00246|binding|INFO|Releasing lport 44997b83-4510-4cb4-9923-c9f1eb78e769 from this chassis (sb_readonly=0)
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.118 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 ovn_controller[94850]: 2026-01-22T22:30:28Z|00247|binding|INFO|Setting lport 44997b83-4510-4cb4-9923-c9f1eb78e769 down in Southbound
Jan 22 22:30:28 compute-0 ovn_controller[94850]: 2026-01-22T22:30:28Z|00248|binding|INFO|Removing iface tap44997b83-45 ovn-installed in OVS
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.122 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.129 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:7f:c5 10.100.0.6'], port_security=['fa:16:3e:e9:7f:c5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fcd5dfc9-aa45-42d6-96d8-739f7eb5504a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a544701e-2e05-4802-ba07-c012963707f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '825c15e60ddd4efeb69accacdb4b129b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a38a8d4d-db17-4f3b-93ae-8cd0d57a26b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70afdb24-37c1-41f2-9284-84cfdd4b7137, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=44997b83-4510-4cb4-9923-c9f1eb78e769) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.131 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 44997b83-4510-4cb4-9923-c9f1eb78e769 in datapath a544701e-2e05-4802-ba07-c012963707f2 unbound from our chassis
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.133 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a544701e-2e05-4802-ba07-c012963707f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.134 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82ce08c9-1d28-4781-94b6-e06776f9404d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.135 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a544701e-2e05-4802-ba07-c012963707f2 namespace which is not needed anymore
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.151 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 22 22:30:28 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000048.scope: Consumed 17.301s CPU time.
Jan 22 22:30:28 compute-0 systemd-machined[154006]: Machine qemu-29-instance-00000048 terminated.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.294 182729 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Updating instance_info_cache with network_info: [{"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:28 compute-0 kernel: tap44997b83-45: entered promiscuous mode
Jan 22 22:30:28 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [NOTICE]   (220492) : haproxy version is 2.8.14-c23fe91
Jan 22 22:30:28 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [NOTICE]   (220492) : path to executable is /usr/sbin/haproxy
Jan 22 22:30:28 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [WARNING]  (220492) : Exiting Master process...
Jan 22 22:30:28 compute-0 kernel: tap44997b83-45 (unregistering): left promiscuous mode
Jan 22 22:30:28 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [ALERT]    (220492) : Current worker (220494) exited with code 143 (Terminated)
Jan 22 22:30:28 compute-0 neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2[220488]: [WARNING]  (220492) : All workers exited. Exiting... (0)
Jan 22 22:30:28 compute-0 systemd[1]: libpod-c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f.scope: Deactivated successfully.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.327 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Releasing lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.327 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Instance network_info: |[{"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.328 182729 DEBUG oslo_concurrency.lockutils [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:28 compute-0 podman[221488]: 2026-01-22 22:30:28.328577212 +0000 UTC m=+0.078720590 container died c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.329 182729 DEBUG nova.network.neutron [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Refreshing network info cache for port 4ba37fa6-0119-454f-8cc7-5ac2a143374a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.335 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Start _get_guest_xml network_info=[{"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.337 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.362 182729 WARNING nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:30:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f-userdata-shm.mount: Deactivated successfully.
Jan 22 22:30:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-5bb91b3b1dc4a21125cd99b8b364a0db20849ba69ce7c1133e106595a9b074ef-merged.mount: Deactivated successfully.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.380 182729 DEBUG nova.compute.manager [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-unplugged-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.381 182729 DEBUG oslo_concurrency.lockutils [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.381 182729 DEBUG oslo_concurrency.lockutils [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.381 182729 DEBUG oslo_concurrency.lockutils [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.381 182729 DEBUG nova.compute.manager [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] No waiting events found dispatching network-vif-unplugged-44997b83-4510-4cb4-9923-c9f1eb78e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.382 182729 DEBUG nova.compute.manager [req-efe10b09-afd0-4a29-a46e-3731585c22bf req-6bb42774-c3a3-40e1-93e8-71613990bb2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-unplugged-44997b83-4510-4cb4-9923-c9f1eb78e769 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.383 182729 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.383 182729 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:30:28 compute-0 podman[221488]: 2026-01-22 22:30:28.384759929 +0000 UTC m=+0.134903237 container cleanup c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.387 182729 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.388 182729 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.389 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.389 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.390 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.390 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.390 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.390 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.391 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.391 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.391 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.391 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.392 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.392 182729 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.396 182729 DEBUG nova.virt.libvirt.vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-1',id=82,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_name='tempe
st-ListServersNegativeTestJSON-1929749532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:21Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=9f290c4e-3649-4826-a9bd-1a7a4a6b7539,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.396 182729 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.397 182729 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.398 182729 DEBUG nova.objects.instance [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.403 182729 INFO nova.virt.libvirt.driver [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Instance destroyed successfully.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.403 182729 DEBUG nova.objects.instance [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lazy-loading 'resources' on Instance uuid fcd5dfc9-aa45-42d6-96d8-739f7eb5504a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:28 compute-0 systemd[1]: libpod-conmon-c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f.scope: Deactivated successfully.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.417 182729 DEBUG nova.virt.libvirt.vif [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-75282839',display_name='tempest-ServerActionsTestOtherA-server-75282839',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-75282839',id=72,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBkpc4JM5MIuxSEDA6kfPzyskWsK8tI2Fh/Lqyh17yIJ2pJxhfifIULVNg5h9fSbuGmGwCrb5kJWHYC3ZvkDDXwknQcbFIVzKg+3pyYtRS9H4Udhz3FXLW262IKMi1lDZQ==',key_name='tempest-keypair-191312094',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='825c15e60ddd4efeb69accacdb4b129b',ramdisk_id='',reservation_id='r-204hr0tv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-658780637',owner_user_name='tempest-ServerActionsTestOtherA-658780637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc3e5f9d1ee84e48a089c2636d28a7b0',uuid=fcd5dfc9-aa45-42d6-96d8-739f7eb5504a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.417 182729 DEBUG nova.network.os_vif_util [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converting VIF {"id": "44997b83-4510-4cb4-9923-c9f1eb78e769", "address": "fa:16:3e:e9:7f:c5", "network": {"id": "a544701e-2e05-4802-ba07-c012963707f2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-119292375-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "825c15e60ddd4efeb69accacdb4b129b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44997b83-45", "ovs_interfaceid": "44997b83-4510-4cb4-9923-c9f1eb78e769", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.418 182729 DEBUG nova.network.os_vif_util [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.418 182729 DEBUG os_vif [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.420 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <uuid>9f290c4e-3649-4826-a9bd-1a7a4a6b7539</uuid>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <name>instance-00000052</name>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:name>tempest-ListServersNegativeTestJSON-server-227483673-1</nova:name>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:30:28</nova:creationTime>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:user uuid="24157ae704064825a4f59adf1d187391">tempest-ListServersNegativeTestJSON-1929749532-project-member</nova:user>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:project uuid="0604aab7ee464a1ca74c3ef627dcc854">tempest-ListServersNegativeTestJSON-1929749532</nova:project>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         <nova:port uuid="4ba37fa6-0119-454f-8cc7-5ac2a143374a">
Jan 22 22:30:28 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <system>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="serial">9f290c4e-3649-4826-a9bd-1a7a4a6b7539</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="uuid">9f290c4e-3649-4826-a9bd-1a7a4a6b7539</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </system>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <os>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </os>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <features>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </features>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.config"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:5d:56:d2"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <target dev="tap4ba37fa6-01"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/console.log" append="off"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <video>
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </video>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:30:28 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:30:28 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:30:28 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:30:28 compute-0 nova_compute[182725]: </domain>
Jan 22 22:30:28 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.421 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Preparing to wait for external event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.421 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.421 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.421 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.422 182729 DEBUG nova.virt.libvirt.vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-1',id=82,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_n
ame='tempest-ListServersNegativeTestJSON-1929749532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:21Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=9f290c4e-3649-4826-a9bd-1a7a4a6b7539,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.422 182729 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.422 182729 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.423 182729 DEBUG os_vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.423 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.424 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44997b83-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.425 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.426 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.427 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.427 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.428 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.430 182729 INFO os_vif [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:7f:c5,bridge_name='br-int',has_traffic_filtering=True,id=44997b83-4510-4cb4-9923-c9f1eb78e769,network=Network(a544701e-2e05-4802-ba07-c012963707f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44997b83-45')
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.431 182729 INFO nova.virt.libvirt.driver [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Deleting instance files /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a_del
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.431 182729 INFO nova.virt.libvirt.driver [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Deletion of /var/lib/nova/instances/fcd5dfc9-aa45-42d6-96d8-739f7eb5504a_del complete
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.438 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ba37fa6-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.439 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ba37fa6-01, col_values=(('external_ids', {'iface-id': '4ba37fa6-0119-454f-8cc7-5ac2a143374a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:56:d2', 'vm-uuid': '9f290c4e-3649-4826-a9bd-1a7a4a6b7539'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.440 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 NetworkManager[54954]: <info>  [1769121028.4419] manager: (tap4ba37fa6-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.445 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.447 182729 INFO os_vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01')
Jan 22 22:30:28 compute-0 podman[221530]: 2026-01-22 22:30:28.468229559 +0000 UTC m=+0.055635394 container remove c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.479 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a4212865-5188-49cb-afa5-de3346fb45c5]: (4, ('Thu Jan 22 10:30:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2 (c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f)\nc78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f\nThu Jan 22 10:30:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a544701e-2e05-4802-ba07-c012963707f2 (c78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f)\nc78840f85a059902a0281441a853f55f2013b3fd9c4716555066ecb6c165c89f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.482 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[39c10c93-aeba-4744-b2b0-5104de8e6b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.483 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa544701e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:28 compute-0 kernel: tapa544701e-20: left promiscuous mode
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.512 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.515 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.516 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.516 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No VIF found with MAC fa:16:3e:5d:56:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.517 182729 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Using config drive
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.516 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d375e60-c368-497f-9da7-dcc0bcfbae44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.520 182729 INFO nova.compute.manager [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.520 182729 DEBUG oslo.service.loopingcall [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.521 182729 DEBUG nova.compute.manager [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.521 182729 DEBUG nova.network.neutron [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.531 182729 DEBUG nova.compute.manager [req-3028d4c2-b40b-4b4a-b7f8-0aa8790fb78a req-3f6f9219-d091-4015-a0ab-25cfc75ab280 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Received event network-vif-deleted-74ec03d6-0608-4d5c-9426-b7bf6c291c36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.547 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a18f492b-e21c-41db-bb07-e06717bb3fc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.549 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[33019248-cee1-402d-8366-4e1205e3a73b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.576 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[067b90e4-0598-4369-b272-6bc71f3134f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457633, 'reachable_time': 19492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221552, 'error': None, 'target': 'ovnmeta-a544701e-2e05-4802-ba07-c012963707f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.579 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a544701e-2e05-4802-ba07-c012963707f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:30:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:28.580 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[87d5d5a7-0c22-428b-b014-9f5052352716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:28 compute-0 systemd[1]: run-netns-ovnmeta\x2da544701e\x2d2e05\x2d4802\x2dba07\x2dc012963707f2.mount: Deactivated successfully.
Jan 22 22:30:28 compute-0 nova_compute[182725]: 2026-01-22 22:30:28.895 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:29 compute-0 nova_compute[182725]: 2026-01-22 22:30:29.791 182729 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Creating config drive at /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.config
Jan 22 22:30:29 compute-0 nova_compute[182725]: 2026-01-22 22:30:29.796 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqyiqr5v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:29 compute-0 nova_compute[182725]: 2026-01-22 22:30:29.942 182729 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqyiqr5v" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:30 compute-0 kernel: tap4ba37fa6-01: entered promiscuous mode
Jan 22 22:30:30 compute-0 systemd-udevd[221470]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:30:30 compute-0 ovn_controller[94850]: 2026-01-22T22:30:30Z|00249|binding|INFO|Claiming lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a for this chassis.
Jan 22 22:30:30 compute-0 ovn_controller[94850]: 2026-01-22T22:30:30Z|00250|binding|INFO|4ba37fa6-0119-454f-8cc7-5ac2a143374a: Claiming fa:16:3e:5d:56:d2 10.100.0.14
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.055 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.0566] manager: (tap4ba37fa6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.0670] device (tap4ba37fa6-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.065 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:56:d2 10.100.0.14'], port_security=['fa:16:3e:5d:56:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9f290c4e-3649-4826-a9bd-1a7a4a6b7539', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4ba37fa6-0119-454f-8cc7-5ac2a143374a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.066 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba37fa6-0119-454f-8cc7-5ac2a143374a in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 bound to our chassis
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.0686] device (tap4ba37fa6-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.068 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 ovn_controller[94850]: 2026-01-22T22:30:30Z|00251|binding|INFO|Setting lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a ovn-installed in OVS
Jan 22 22:30:30 compute-0 ovn_controller[94850]: 2026-01-22T22:30:30Z|00252|binding|INFO|Setting lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a up in Southbound
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.074 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.085 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc895dd-8c3c-4855-890e-2b3b05faa4ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.087 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb03cd250-01 in ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.090 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb03cd250-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.090 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94183ae0-b252-4aeb-8618-8c2d5d54ee9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.091 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ed181346-ff14-4055-a825-859d034d9fda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 systemd-machined[154006]: New machine qemu-32-instance-00000052.
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.109 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[30c9cf01-02bc-42ec-897a-9e8b9339dc6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-00000052.
Jan 22 22:30:30 compute-0 podman[221563]: 2026-01-22 22:30:30.127155459 +0000 UTC m=+0.085439101 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.125 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[07e65d90-3897-4b3c-9911-7358b68f7098]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.157 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[77b18e4e-bc44-4dd4-a4a9-b37dfda3d260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.164 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f0d9a7-0f83-458a-9434-8c78f51a01b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.1651] manager: (tapb03cd250-00): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.199 182729 DEBUG nova.compute.manager [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.207 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c969682f-b389-4145-a356-ae445bea8fa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.211 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e5adb271-22a5-4816-b905-0caa6b5d0cc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.2399] device (tapb03cd250-00): carrier: link connected
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.246 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7436e4fc-9972-4a71-9bb3-955aad3f7ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.275 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dcfa6df1-3e8b-4d2c-a11b-b54d14df9cb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466784, 'reachable_time': 27915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221627, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.294 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2a212f6d-cfe2-4ecf-8b8c-4128d5b89260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:6705'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466784, 'tstamp': 466784}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221628, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.319 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.319 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.316 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcb9074-e8c7-400d-8bb7-ad7a5f425f57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466784, 'reachable_time': 27915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221629, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.343 182729 DEBUG nova.objects.instance [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.367 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c75aef68-87be-49ea-abbd-22f95d991fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.379 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.380 182729 INFO nova.compute.claims [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.381 182729 DEBUG nova.objects.instance [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.399 182729 DEBUG nova.objects.instance [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.438 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[88bc7bff-26e3-471b-bdd0-6bfd40ab420d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.440 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.442 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.443 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb03cd250-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.445 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 kernel: tapb03cd250-00: entered promiscuous mode
Jan 22 22:30:30 compute-0 NetworkManager[54954]: <info>  [1769121030.4471] manager: (tapb03cd250-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.450 182729 INFO nova.compute.resource_tracker [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating resource usage from migration 969cd541-4fe9-48ce-bf39-c0d737aa6531
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.451 182729 DEBUG nova.compute.resource_tracker [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Starting to track incoming migration 969cd541-4fe9-48ce-bf39-c0d737aa6531 with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.452 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.453 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb03cd250-00, col_values=(('external_ids', {'iface-id': 'a20b41a8-fffe-4d8c-83ca-cc00cb778065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 ovn_controller[94850]: 2026-01-22T22:30:30Z|00253|binding|INFO|Releasing lport a20b41a8-fffe-4d8c-83ca-cc00cb778065 from this chassis (sb_readonly=0)
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.456 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.458 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.460 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b719ac57-9a9b-482b-8d1b-5f91f20bce0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.460 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:30:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:30.462 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'env', 'PROCESS_TAG=haproxy-b03cd250-02c3-425c-a1d4-c454aa74a746', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b03cd250-02c3-425c-a1d4-c454aa74a746.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.475 182729 DEBUG nova.compute.manager [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.478 182729 DEBUG oslo_concurrency.lockutils [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.478 182729 DEBUG oslo_concurrency.lockutils [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.479 182729 DEBUG oslo_concurrency.lockutils [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.479 182729 DEBUG nova.compute.manager [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] No waiting events found dispatching network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.479 182729 WARNING nova.compute.manager [req-0885dca9-049e-49b1-8b79-72c2c1b43ec6 req-7d36e468-0a7f-4622-916b-4df5bdd7e4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received unexpected event network-vif-plugged-44997b83-4510-4cb4-9923-c9f1eb78e769 for instance with vm_state active and task_state deleting.
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.480 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.501 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121030.5008602, 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.501 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] VM Started (Lifecycle Event)
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.529 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.538 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121030.5016549, 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.538 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] VM Paused (Lifecycle Event)
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.559 182729 DEBUG nova.compute.provider_tree [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.578 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.584 182729 DEBUG nova.scheduler.client.report [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.593 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.614 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.634 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.635 182729 INFO nova.compute.manager [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Migrating
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.641 182729 DEBUG nova.network.neutron [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.724 182729 INFO nova.compute.manager [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Took 2.20 seconds to deallocate network for instance.
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.842 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:30 compute-0 nova_compute[182725]: 2026-01-22 22:30:30.844 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:30 compute-0 podman[221668]: 2026-01-22 22:30:30.918038155 +0000 UTC m=+0.078878484 container create 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 22:30:30 compute-0 systemd[1]: Started libpod-conmon-36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75.scope.
Jan 22 22:30:30 compute-0 podman[221668]: 2026-01-22 22:30:30.886075353 +0000 UTC m=+0.046915722 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:30:30 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:30:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d0327bee804ebb3f553edc817ba3dd2b182a39d1df22bb155d901d9f2374ffd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:30:31 compute-0 podman[221668]: 2026-01-22 22:30:31.017973803 +0000 UTC m=+0.178814172 container init 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.023 182729 DEBUG nova.compute.provider_tree [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:31 compute-0 podman[221668]: 2026-01-22 22:30:31.025493664 +0000 UTC m=+0.186334003 container start 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.040 182729 DEBUG nova.scheduler.client.report [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:31 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [NOTICE]   (221687) : New worker (221689) forked
Jan 22 22:30:31 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [NOTICE]   (221687) : Loading success.
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.071 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.098 182729 INFO nova.scheduler.client.report [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Deleted allocations for instance fcd5dfc9-aa45-42d6-96d8-739f7eb5504a
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.116 182729 DEBUG nova.network.neutron [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Updated VIF entry in instance network info cache for port 4ba37fa6-0119-454f-8cc7-5ac2a143374a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.116 182729 DEBUG nova.network.neutron [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Updating instance_info_cache with network_info: [{"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.134 182729 DEBUG oslo_concurrency.lockutils [req-ca3e2b85-1f2e-4813-a6f1-6e4505aa0a28 req-c3283c6d-0ff8-46c5-989e-36ea17e5b760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9f290c4e-3649-4826-a9bd-1a7a4a6b7539" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:31 compute-0 nova_compute[182725]: 2026-01-22 22:30:31.192 182729 DEBUG oslo_concurrency.lockutils [None req-b0a369b9-ddd4-4900-841c-19eeff8a557f fc3e5f9d1ee84e48a089c2636d28a7b0 825c15e60ddd4efeb69accacdb4b129b - - default default] Lock "fcd5dfc9-aa45-42d6-96d8-739f7eb5504a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.762 182729 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Received event network-vif-deleted-44997b83-4510-4cb4-9923-c9f1eb78e769 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.763 182729 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.764 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.764 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.764 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.765 182729 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Processing event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.765 182729 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.766 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.766 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.767 182729 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.767 182729 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] No waiting events found dispatching network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.767 182729 WARNING nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received unexpected event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a for instance with vm_state building and task_state spawning.
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.768 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.774 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.775 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121032.7739701, 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.775 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] VM Resumed (Lifecycle Event)
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.780 182729 INFO nova.virt.libvirt.driver [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Instance spawned successfully.
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.781 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.806 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.813 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.819 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.819 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.820 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.820 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.820 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.820 182729 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.845 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.935 182729 INFO nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Took 11.28 seconds to spawn the instance on the hypervisor.
Jan 22 22:30:32 compute-0 nova_compute[182725]: 2026-01-22 22:30:32.935 182729 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:33 compute-0 nova_compute[182725]: 2026-01-22 22:30:33.026 182729 INFO nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Took 12.06 seconds to build instance.
Jan 22 22:30:33 compute-0 nova_compute[182725]: 2026-01-22 22:30:33.043 182729 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:33 compute-0 nova_compute[182725]: 2026-01-22 22:30:33.441 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:33 compute-0 sshd-session[221698]: Accepted publickey for nova from 192.168.122.102 port 47400 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:30:33 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 22:30:33 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 22:30:33 compute-0 systemd-logind[801]: New session 41 of user nova.
Jan 22 22:30:33 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 22:30:33 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 22:30:33 compute-0 systemd[221702]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:33 compute-0 systemd[221702]: Queued start job for default target Main User Target.
Jan 22 22:30:33 compute-0 systemd[221702]: Created slice User Application Slice.
Jan 22 22:30:33 compute-0 systemd[221702]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:30:33 compute-0 systemd[221702]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 22:30:33 compute-0 systemd[221702]: Reached target Paths.
Jan 22 22:30:33 compute-0 systemd[221702]: Reached target Timers.
Jan 22 22:30:33 compute-0 systemd[221702]: Starting D-Bus User Message Bus Socket...
Jan 22 22:30:33 compute-0 systemd[221702]: Starting Create User's Volatile Files and Directories...
Jan 22 22:30:33 compute-0 systemd[221702]: Listening on D-Bus User Message Bus Socket.
Jan 22 22:30:33 compute-0 systemd[221702]: Reached target Sockets.
Jan 22 22:30:33 compute-0 systemd[221702]: Finished Create User's Volatile Files and Directories.
Jan 22 22:30:33 compute-0 systemd[221702]: Reached target Basic System.
Jan 22 22:30:33 compute-0 systemd[221702]: Reached target Main User Target.
Jan 22 22:30:33 compute-0 systemd[221702]: Startup finished in 174ms.
Jan 22 22:30:33 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 22:30:33 compute-0 systemd[1]: Started Session 41 of User nova.
Jan 22 22:30:33 compute-0 sshd-session[221698]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:33 compute-0 nova_compute[182725]: 2026-01-22 22:30:33.898 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:33 compute-0 sshd-session[221717]: Received disconnect from 192.168.122.102 port 47400:11: disconnected by user
Jan 22 22:30:33 compute-0 sshd-session[221717]: Disconnected from user nova 192.168.122.102 port 47400
Jan 22 22:30:33 compute-0 sshd-session[221698]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:30:33 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Jan 22 22:30:33 compute-0 systemd-logind[801]: Session 41 logged out. Waiting for processes to exit.
Jan 22 22:30:33 compute-0 systemd-logind[801]: Removed session 41.
Jan 22 22:30:34 compute-0 sshd-session[221719]: Accepted publickey for nova from 192.168.122.102 port 47412 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:30:34 compute-0 systemd-logind[801]: New session 43 of user nova.
Jan 22 22:30:34 compute-0 systemd[1]: Started Session 43 of User nova.
Jan 22 22:30:34 compute-0 sshd-session[221719]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:34 compute-0 sshd-session[221722]: Received disconnect from 192.168.122.102 port 47412:11: disconnected by user
Jan 22 22:30:34 compute-0 sshd-session[221722]: Disconnected from user nova 192.168.122.102 port 47412
Jan 22 22:30:34 compute-0 sshd-session[221719]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:30:34 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Jan 22 22:30:34 compute-0 systemd-logind[801]: Session 43 logged out. Waiting for processes to exit.
Jan 22 22:30:34 compute-0 systemd-logind[801]: Removed session 43.
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.291 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.291 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.291 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.292 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.292 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.315 182729 INFO nova.compute.manager [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Terminating instance
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.336 182729 DEBUG nova.compute.manager [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:30:34 compute-0 kernel: tap4ba37fa6-01 (unregistering): left promiscuous mode
Jan 22 22:30:34 compute-0 NetworkManager[54954]: <info>  [1769121034.3655] device (tap4ba37fa6-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00254|binding|INFO|Releasing lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a from this chassis (sb_readonly=0)
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.374 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00255|binding|INFO|Setting lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a down in Southbound
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00256|binding|INFO|Removing iface tap4ba37fa6-01 ovn-installed in OVS
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.378 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.385 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:56:d2 10.100.0.14'], port_security=['fa:16:3e:5d:56:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9f290c4e-3649-4826-a9bd-1a7a4a6b7539', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4ba37fa6-0119-454f-8cc7-5ac2a143374a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.386 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba37fa6-0119-454f-8cc7-5ac2a143374a in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 unbound from our chassis
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.388 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b03cd250-02c3-425c-a1d4-c454aa74a746, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.389 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c156b694-36e3-4e5e-bedb-450073f1bc01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.389 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace which is not needed anymore
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.399 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000052.scope: Deactivated successfully.
Jan 22 22:30:34 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000052.scope: Consumed 1.909s CPU time.
Jan 22 22:30:34 compute-0 systemd-machined[154006]: Machine qemu-32-instance-00000052 terminated.
Jan 22 22:30:34 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [NOTICE]   (221687) : haproxy version is 2.8.14-c23fe91
Jan 22 22:30:34 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [NOTICE]   (221687) : path to executable is /usr/sbin/haproxy
Jan 22 22:30:34 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [WARNING]  (221687) : Exiting Master process...
Jan 22 22:30:34 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [ALERT]    (221687) : Current worker (221689) exited with code 143 (Terminated)
Jan 22 22:30:34 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221683]: [WARNING]  (221687) : All workers exited. Exiting... (0)
Jan 22 22:30:34 compute-0 systemd[1]: libpod-36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75.scope: Deactivated successfully.
Jan 22 22:30:34 compute-0 podman[221748]: 2026-01-22 22:30:34.519641024 +0000 UTC m=+0.042114701 container died 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:30:34 compute-0 kernel: tap4ba37fa6-01: entered promiscuous mode
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.567 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 kernel: tap4ba37fa6-01 (unregistering): left promiscuous mode
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00257|binding|INFO|Claiming lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a for this chassis.
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00258|binding|INFO|4ba37fa6-0119-454f-8cc7-5ac2a143374a: Claiming fa:16:3e:5d:56:d2 10.100.0.14
Jan 22 22:30:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75-userdata-shm.mount: Deactivated successfully.
Jan 22 22:30:34 compute-0 NetworkManager[54954]: <info>  [1769121034.5736] manager: (tap4ba37fa6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Jan 22 22:30:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-5d0327bee804ebb3f553edc817ba3dd2b182a39d1df22bb155d901d9f2374ffd-merged.mount: Deactivated successfully.
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00259|binding|INFO|Setting lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a ovn-installed in OVS
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00260|if_status|INFO|Not setting lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a down as sb is readonly
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.591 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 podman[221748]: 2026-01-22 22:30:34.59862581 +0000 UTC m=+0.121099497 container cleanup 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:30:34 compute-0 systemd[1]: libpod-conmon-36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75.scope: Deactivated successfully.
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.626 182729 INFO nova.virt.libvirt.driver [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Instance destroyed successfully.
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.627 182729 DEBUG nova.objects.instance [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'resources' on Instance uuid 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:34 compute-0 podman[221788]: 2026-01-22 22:30:34.675939533 +0000 UTC m=+0.052630637 container remove 36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.682 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d622640e-1bfd-4cd7-a0df-c96a4d3f2bf4]: (4, ('Thu Jan 22 10:30:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75)\n36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75\nThu Jan 22 10:30:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75)\n36e25ceed0ee72b32b42e9dcd923ac6edea65ac225f7eb816745612fc050dc75\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.684 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[16c0e871-2c80-41a3-b9f9-2fb72f23177e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.685 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:34 compute-0 kernel: tapb03cd250-00: left promiscuous mode
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.688 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 ovn_controller[94850]: 2026-01-22T22:30:34Z|00261|binding|INFO|Releasing lport 4ba37fa6-0119-454f-8cc7-5ac2a143374a from this chassis (sb_readonly=0)
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.720 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.721 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.723 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d44a902a-664c-4ca5-86ae-cb88577066eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.739 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0429bffb-e7b8-434f-888c-790284c43fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.740 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c5e041-d066-4884-ad40-d47b31c30d8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.766 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:56:d2 10.100.0.14'], port_security=['fa:16:3e:5d:56:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9f290c4e-3649-4826-a9bd-1a7a4a6b7539', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4ba37fa6-0119-454f-8cc7-5ac2a143374a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.765 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f98649-b8e8-4096-aeb0-01610ab4b36a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466775, 'reachable_time': 24025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221808, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.772 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.772 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[d43cbc65-314a-4a73-ab74-22bb88f1833b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 systemd[1]: run-netns-ovnmeta\x2db03cd250\x2d02c3\x2d425c\x2da1d4\x2dc454aa74a746.mount: Deactivated successfully.
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.774 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba37fa6-0119-454f-8cc7-5ac2a143374a in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 bound to our chassis
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.776 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.786 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:56:d2 10.100.0.14'], port_security=['fa:16:3e:5d:56:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9f290c4e-3649-4826-a9bd-1a7a4a6b7539', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4ba37fa6-0119-454f-8cc7-5ac2a143374a) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.790 182729 DEBUG nova.virt.libvirt.vif [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-1',id=82,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_name='tempest-ListServersNegativeTestJSON-1929749532-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:32Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=9f290c4e-3649-4826-a9bd-1a7a4a6b7539,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.791 182729 DEBUG nova.network.os_vif_util [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "address": "fa:16:3e:5d:56:d2", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba37fa6-01", "ovs_interfaceid": "4ba37fa6-0119-454f-8cc7-5ac2a143374a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.792 182729 DEBUG nova.network.os_vif_util [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.792 182729 DEBUG os_vif [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.793 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.794 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba37fa6-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.795 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.796 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.797 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5be08ec-ffd3-4685-be23-96a5cc7b9f0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.799 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb03cd250-01 in ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.799 182729 INFO os_vif [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:56:d2,bridge_name='br-int',has_traffic_filtering=True,id=4ba37fa6-0119-454f-8cc7-5ac2a143374a,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba37fa6-01')
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.800 182729 INFO nova.virt.libvirt.driver [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Deleting instance files /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539_del
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.801 182729 INFO nova.virt.libvirt.driver [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Deletion of /var/lib/nova/instances/9f290c4e-3649-4826-a9bd-1a7a4a6b7539_del complete
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.802 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb03cd250-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.802 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2b7783-7ab7-4187-9fdd-26b9d9cf1cd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.804 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9cb3ec-5e88-4c27-a94f-b6e0e53d6ec6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.824 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1ea0b0-2028-40ee-a20c-507ce9767760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.848 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5900d127-078c-4a57-a745-39ba6b4361ce]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.879 182729 INFO nova.compute.manager [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Took 0.54 seconds to destroy the instance on the hypervisor.
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.881 182729 DEBUG oslo.service.loopingcall [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.881 182729 DEBUG nova.compute.manager [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.882 182729 DEBUG nova.network.neutron [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.896 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f2dbe6c8-ce39-4e79-a133-e30ef090fe41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 systemd-udevd[221727]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.905 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[10c24286-7154-4653-97bf-27705a88d9a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 NetworkManager[54954]: <info>  [1769121034.9072] manager: (tapb03cd250-00): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.950 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b210dc42-9e35-415f-8515-51e8391e1b7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.953 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c8388f25-0194-4af2-b5ad-0b1eefd29e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.962 182729 DEBUG nova.compute.manager [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-unplugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.963 182729 DEBUG oslo_concurrency.lockutils [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.963 182729 DEBUG oslo_concurrency.lockutils [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.963 182729 DEBUG oslo_concurrency.lockutils [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.964 182729 DEBUG nova.compute.manager [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] No waiting events found dispatching network-vif-unplugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:34 compute-0 nova_compute[182725]: 2026-01-22 22:30:34.964 182729 DEBUG nova.compute.manager [req-d909baed-5ab0-423c-92ad-692f8257f6c0 req-abe2da24-4d02-4d5f-b9c3-1472805eab92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-unplugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:30:34 compute-0 NetworkManager[54954]: <info>  [1769121034.9759] device (tapb03cd250-00): carrier: link connected
Jan 22 22:30:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:34.980 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cb198341-d0c0-44f8-aac0-73409af2630d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.009 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6e0865-de7e-4015-bc5c-92c5b68f2cbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467257, 'reachable_time': 23770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221833, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.026 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e0f522-0c3e-40f9-b45d-520051c2b9a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:6705'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467257, 'tstamp': 467257}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221834, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.044 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa5caeb-45a1-4b6c-918f-c62daea64aff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467257, 'reachable_time': 23770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221835, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.085 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f295a8c-a877-45ba-bb20-0b94dc978c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.175 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f141c703-c96e-45d7-89da-b355d49d6812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.178 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.178 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.179 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb03cd250-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.183 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:35 compute-0 NetworkManager[54954]: <info>  [1769121035.1842] manager: (tapb03cd250-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 22 22:30:35 compute-0 kernel: tapb03cd250-00: entered promiscuous mode
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.187 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb03cd250-00, col_values=(('external_ids', {'iface-id': 'a20b41a8-fffe-4d8c-83ca-cc00cb778065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.188 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:35 compute-0 ovn_controller[94850]: 2026-01-22T22:30:35Z|00262|binding|INFO|Releasing lport a20b41a8-fffe-4d8c-83ca-cc00cb778065 from this chassis (sb_readonly=0)
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.190 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.191 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.192 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ec387e00-d9a0-4080-bda3-eeb54c4f4102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.193 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.194 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'env', 'PROCESS_TAG=haproxy-b03cd250-02c3-425c-a1d4-c454aa74a746', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b03cd250-02c3-425c-a1d4-c454aa74a746.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.478 182729 DEBUG nova.network.neutron [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.494 182729 INFO nova.compute.manager [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Took 0.61 seconds to deallocate network for instance.
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.568 182729 DEBUG nova.compute.manager [req-9891176a-8cb1-4b90-9a39-8c638941021f req-cb6b2e7e-0763-4076-a1ec-dbf507a6f0f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-deleted-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.583 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.583 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.656 182729 DEBUG nova.compute.provider_tree [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.669 182729 DEBUG nova.scheduler.client.report [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:30:35 compute-0 podman[221867]: 2026-01-22 22:30:35.670275586 +0000 UTC m=+0.083752068 container create 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.687 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.712 182729 INFO nova.scheduler.client.report [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Deleted allocations for instance 9f290c4e-3649-4826-a9bd-1a7a4a6b7539
Jan 22 22:30:35 compute-0 systemd[1]: Started libpod-conmon-79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c.scope.
Jan 22 22:30:35 compute-0 podman[221867]: 2026-01-22 22:30:35.629982153 +0000 UTC m=+0.043458725 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:30:35 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:30:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2260728f0545b31aac412248ffa898f24049284d25d937af52e759f8d22a5263/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:30:35 compute-0 podman[221867]: 2026-01-22 22:30:35.767075155 +0000 UTC m=+0.180551647 container init 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 22:30:35 compute-0 podman[221867]: 2026-01-22 22:30:35.773495378 +0000 UTC m=+0.186971850 container start 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:30:35 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [NOTICE]   (221888) : New worker (221890) forked
Jan 22 22:30:35 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [NOTICE]   (221888) : Loading success.
Jan 22 22:30:35 compute-0 nova_compute[182725]: 2026-01-22 22:30:35.809 182729 DEBUG oslo_concurrency.lockutils [None req-93e10fa6-b1ca-4757-b3f2-cd3ac8650660 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.518s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.845 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba37fa6-0119-454f-8cc7-5ac2a143374a in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 unbound from our chassis
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.849 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b03cd250-02c3-425c-a1d4-c454aa74a746, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.850 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a8a9f1-9c27-4d23-9cec-f37f516d0fdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:35.851 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace which is not needed anymore
Jan 22 22:30:36 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [NOTICE]   (221888) : haproxy version is 2.8.14-c23fe91
Jan 22 22:30:36 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [NOTICE]   (221888) : path to executable is /usr/sbin/haproxy
Jan 22 22:30:36 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [WARNING]  (221888) : Exiting Master process...
Jan 22 22:30:36 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [ALERT]    (221888) : Current worker (221890) exited with code 143 (Terminated)
Jan 22 22:30:36 compute-0 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[221884]: [WARNING]  (221888) : All workers exited. Exiting... (0)
Jan 22 22:30:36 compute-0 systemd[1]: libpod-79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c.scope: Deactivated successfully.
Jan 22 22:30:36 compute-0 podman[221916]: 2026-01-22 22:30:36.016148331 +0000 UTC m=+0.051651273 container died 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:30:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c-userdata-shm.mount: Deactivated successfully.
Jan 22 22:30:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-2260728f0545b31aac412248ffa898f24049284d25d937af52e759f8d22a5263-merged.mount: Deactivated successfully.
Jan 22 22:30:36 compute-0 podman[221916]: 2026-01-22 22:30:36.061313608 +0000 UTC m=+0.096816540 container cleanup 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:30:36 compute-0 systemd[1]: libpod-conmon-79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c.scope: Deactivated successfully.
Jan 22 22:30:36 compute-0 podman[221945]: 2026-01-22 22:30:36.162282812 +0000 UTC m=+0.065974726 container remove 79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.169 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bad02048-6e75-4bbc-8c37-07ec8c9c0ecd]: (4, ('Thu Jan 22 10:30:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c)\n79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c\nThu Jan 22 10:30:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c)\n79507d12f3acd606523985cb39a71a17c34a9d7a9ab813fe175c680069eca19c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.172 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f08974e3-a7cd-4dc0-b212-1ae7f752015b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.174 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.177 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:36 compute-0 kernel: tapb03cd250-00: left promiscuous mode
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.180 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.184 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[abc378cf-54f7-444a-9930-95536036638e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.191 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.199 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8a696d-3c7a-4cc8-9fe2-66d62867ce24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.200 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[35900a4d-f2c6-4653-8a09-e557bbe116b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.218 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e12533-1766-4108-8da0-5cd1b201d1d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467249, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221960, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.221 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:30:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:36.221 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f254b4-73f1-416b-91cc-d2bdc7069554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:36 compute-0 systemd[1]: run-netns-ovnmeta\x2db03cd250\x2d02c3\x2d425c\x2da1d4\x2dc454aa74a746.mount: Deactivated successfully.
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.597 182729 DEBUG nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.597 182729 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.597 182729 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.598 182729 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.598 182729 DEBUG nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:36 compute-0 nova_compute[182725]: 2026-01-22 22:30:36.598 182729 WARNING nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_migrating.
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.050 182729 DEBUG nova.compute.manager [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.051 182729 DEBUG oslo_concurrency.lockutils [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.051 182729 DEBUG oslo_concurrency.lockutils [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.052 182729 DEBUG oslo_concurrency.lockutils [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9f290c4e-3649-4826-a9bd-1a7a4a6b7539-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.052 182729 DEBUG nova.compute.manager [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] No waiting events found dispatching network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:37 compute-0 nova_compute[182725]: 2026-01-22 22:30:37.052 182729 WARNING nova.compute.manager [req-d461c184-f326-4ed2-bd97-453260870cf5 req-e577659e-8bb6-46b0-b13b-78b57bb90c90 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Received unexpected event network-vif-plugged-4ba37fa6-0119-454f-8cc7-5ac2a143374a for instance with vm_state deleted and task_state None.
Jan 22 22:30:37 compute-0 sshd-session[221961]: Accepted publickey for nova from 192.168.122.102 port 51428 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:30:37 compute-0 systemd-logind[801]: New session 44 of user nova.
Jan 22 22:30:37 compute-0 systemd[1]: Started Session 44 of User nova.
Jan 22 22:30:37 compute-0 sshd-session[221961]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:38 compute-0 sshd-session[221964]: Received disconnect from 192.168.122.102 port 51428:11: disconnected by user
Jan 22 22:30:38 compute-0 sshd-session[221964]: Disconnected from user nova 192.168.122.102 port 51428
Jan 22 22:30:38 compute-0 sshd-session[221961]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:30:38 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Session 44 logged out. Waiting for processes to exit.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Removed session 44.
Jan 22 22:30:38 compute-0 sshd-session[221966]: Accepted publickey for nova from 192.168.122.102 port 51430 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:30:38 compute-0 systemd-logind[801]: New session 45 of user nova.
Jan 22 22:30:38 compute-0 systemd[1]: Started Session 45 of User nova.
Jan 22 22:30:38 compute-0 sshd-session[221966]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.210 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:38 compute-0 sshd-session[221969]: Received disconnect from 192.168.122.102 port 51430:11: disconnected by user
Jan 22 22:30:38 compute-0 sshd-session[221969]: Disconnected from user nova 192.168.122.102 port 51430
Jan 22 22:30:38 compute-0 sshd-session[221966]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:30:38 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Session 45 logged out. Waiting for processes to exit.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Removed session 45.
Jan 22 22:30:38 compute-0 sshd-session[221971]: Accepted publickey for nova from 192.168.122.102 port 51444 ssh2: ECDSA SHA256:bJw+nK3nNB1k9cgrfD2Ic13lp5cPdevsa8z3rSFhRAk
Jan 22 22:30:38 compute-0 systemd-logind[801]: New session 46 of user nova.
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.390 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:38 compute-0 systemd[1]: Started Session 46 of User nova.
Jan 22 22:30:38 compute-0 sshd-session[221971]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 22:30:38 compute-0 sshd-session[221975]: Received disconnect from 192.168.122.102 port 51444:11: disconnected by user
Jan 22 22:30:38 compute-0 sshd-session[221975]: Disconnected from user nova 192.168.122.102 port 51444
Jan 22 22:30:38 compute-0 sshd-session[221971]: pam_unix(sshd:session): session closed for user nova
Jan 22 22:30:38 compute-0 systemd[1]: session-46.scope: Deactivated successfully.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Session 46 logged out. Waiting for processes to exit.
Jan 22 22:30:38 compute-0 systemd-logind[801]: Removed session 46.
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.745 182729 DEBUG nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.745 182729 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.746 182729 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.746 182729 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.746 182729 DEBUG nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.746 182729 WARNING nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_migrating.
Jan 22 22:30:38 compute-0 nova_compute[182725]: 2026-01-22 22:30:38.900 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:39 compute-0 nova_compute[182725]: 2026-01-22 22:30:39.167 182729 INFO nova.network.neutron [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating port 50b7281e-d0dc-4caf-a920-24203f11da00 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 22:30:39 compute-0 nova_compute[182725]: 2026-01-22 22:30:39.796 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.172 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121025.1714013, edb59ec0-c6f0-4757-b5cc-293686870779 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.173 182729 INFO nova.compute.manager [-] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] VM Stopped (Lifecycle Event)
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.252 182729 DEBUG nova.compute.manager [None req-08c5c2d9-fa09-4aa9-abf6-cf516b2fb12e - - - - - -] [instance: edb59ec0-c6f0-4757-b5cc-293686870779] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.281 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.282 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:40 compute-0 nova_compute[182725]: 2026-01-22 22:30:40.282 182729 DEBUG nova.network.neutron [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:30:41 compute-0 nova_compute[182725]: 2026-01-22 22:30:41.499 182729 DEBUG nova.compute.manager [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:41 compute-0 nova_compute[182725]: 2026-01-22 22:30:41.500 182729 DEBUG nova.compute.manager [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing instance network info cache due to event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:30:41 compute-0 nova_compute[182725]: 2026-01-22 22:30:41.501 182729 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.087 182729 DEBUG nova.network.neutron [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.116 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.122 182729 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.122 182729 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.274 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.276 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.277 182729 INFO nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Creating image(s)
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.279 182729 DEBUG nova.objects.instance [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.326 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.393 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.395 182729 DEBUG nova.virt.disk.api [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.396 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.472 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.474 182729 DEBUG nova.virt.disk.api [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.694 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.694 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Ensure instance console log exists: /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.695 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.696 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.697 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.702 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start _get_guest_xml network_info=[{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.709 182729 WARNING nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.717 182729 DEBUG nova.virt.libvirt.host [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.718 182729 DEBUG nova.virt.libvirt.host [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.722 182729 DEBUG nova.virt.libvirt.host [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.723 182729 DEBUG nova.virt.libvirt.host [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.725 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.725 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.726 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.726 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.726 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.727 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.727 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.727 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.727 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.728 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.728 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.728 182729 DEBUG nova.virt.hardware [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.729 182729 DEBUG nova.objects.instance [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.792 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.862 182729 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.864 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.864 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.866 182729 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.868 182729 DEBUG nova.virt.libvirt.vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:38Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.869 182729 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.871 182729 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.876 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <uuid>5a9390a0-5077-46b6-8f6c-b3b308db8b1d</uuid>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <name>instance-00000051</name>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <memory>196608</memory>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1146472499</nova:name>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:30:42</nova:creationTime>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:flavor name="m1.micro">
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:memory>192</nova:memory>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         <nova:port uuid="50b7281e-d0dc-4caf-a920-24203f11da00">
Jan 22 22:30:42 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <system>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="serial">5a9390a0-5077-46b6-8f6c-b3b308db8b1d</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="uuid">5a9390a0-5077-46b6-8f6c-b3b308db8b1d</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </system>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <os>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </os>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <features>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </features>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:74:11:5a"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <target dev="tap50b7281e-d0"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/console.log" append="off"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <video>
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </video>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:30:42 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:30:42 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:30:42 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:30:42 compute-0 nova_compute[182725]: </domain>
Jan 22 22:30:42 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.879 182729 DEBUG nova.virt.libvirt.vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',i
mage_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:38Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.880 182729 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.881 182729 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.882 182729 DEBUG os_vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.883 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.884 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.884 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.888 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.889 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b7281e-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.890 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50b7281e-d0, col_values=(('external_ids', {'iface-id': '50b7281e-d0dc-4caf-a920-24203f11da00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:11:5a', 'vm-uuid': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.892 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:42 compute-0 NetworkManager[54954]: <info>  [1769121042.8937] manager: (tap50b7281e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.895 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.900 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:42 compute-0 nova_compute[182725]: 2026-01-22 22:30:42.901 182729 INFO os_vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0')
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.015 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.015 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.015 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:74:11:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.016 182729 INFO nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Using config drive
Jan 22 22:30:43 compute-0 kernel: tap50b7281e-d0: entered promiscuous mode
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.079 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.0801] manager: (tap50b7281e-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Jan 22 22:30:43 compute-0 ovn_controller[94850]: 2026-01-22T22:30:43Z|00263|binding|INFO|Claiming lport 50b7281e-d0dc-4caf-a920-24203f11da00 for this chassis.
Jan 22 22:30:43 compute-0 ovn_controller[94850]: 2026-01-22T22:30:43Z|00264|binding|INFO|50b7281e-d0dc-4caf-a920-24203f11da00: Claiming fa:16:3e:74:11:5a 10.100.0.13
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.097 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:11:5a 10.100.0.13'], port_security=['fa:16:3e:74:11:5a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=50b7281e-d0dc-4caf-a920-24203f11da00) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.099 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 50b7281e-d0dc-4caf-a920-24203f11da00 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.100 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:30:43 compute-0 systemd-udevd[222000]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.113 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[66128508-2cca-43f7-91ed-924650ad386a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.114 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.116 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.116 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e6612ea8-307d-4aee-a081-00f34285f2f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.118 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[51bd53e7-89be-4249-be26-092479053c65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.1243] device (tap50b7281e-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.1251] device (tap50b7281e-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.131 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0b9361-80cb-4104-8e58-9171b3f629c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 systemd-machined[154006]: New machine qemu-33-instance-00000051.
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.139 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 ovn_controller[94850]: 2026-01-22T22:30:43Z|00265|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 ovn-installed in OVS
Jan 22 22:30:43 compute-0 ovn_controller[94850]: 2026-01-22T22:30:43Z|00266|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 up in Southbound
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.145 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000051.
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.147 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d74a046-ebab-4a22-a15e-137d0ce86bbd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.176 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[311cb903-4dd2-4ff9-9a64-b9963c03b52e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.182 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[53077840-ff0c-468f-a145-874ac8cb7612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.1837] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.212 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8f7938-d053-4fde-ac84-82c76d79e929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.217 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d9d957-559d-4ac6-863c-7ad858330ec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.2409] device (tap354683a7-30): carrier: link connected
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.248 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ea8860-94df-40a0-825e-2c87de521ee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.268 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6bf493-705a-41cd-a361-c019ed1ffeb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468084, 'reachable_time': 36623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222036, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.286 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9e177d-3d2a-41c8-ae5c-a387463db68c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468084, 'tstamp': 468084}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222037, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.309 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4efe0707-2700-4efa-bfdc-ba810fece0f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468084, 'reachable_time': 36623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222038, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.346 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d66462b6-2ef9-4845-8118-381682d273b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.400 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121028.391148, fcd5dfc9-aa45-42d6-96d8-739f7eb5504a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.401 182729 INFO nova.compute.manager [-] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] VM Stopped (Lifecycle Event)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.424 182729 DEBUG nova.compute.manager [None req-8feb1c43-fbfd-4606-b190-328f800ca30b - - - - - -] [instance: fcd5dfc9-aa45-42d6-96d8-739f7eb5504a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.427 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0aa60d-64d1-4d67-9c26-501867a56a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.429 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.429 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.430 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.432 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 NetworkManager[54954]: <info>  [1769121043.4331] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 22 22:30:43 compute-0 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.435 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.435 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 ovn_controller[94850]: 2026-01-22T22:30:43Z|00267|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.450 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.452 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.453 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[df032399-fc40-4bb8-a4ac-b69750bdff89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.454 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:30:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:43.455 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.617 182729 DEBUG nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.618 182729 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.618 182729 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.618 182729 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.618 182729 DEBUG nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.619 182729 WARNING nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_finish.
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.779 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121043.7786524, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.780 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Resumed (Lifecycle Event)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.782 182729 DEBUG nova.compute.manager [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.786 182729 INFO nova.virt.libvirt.driver [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance running successfully.
Jan 22 22:30:43 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.788 182729 DEBUG nova.virt.libvirt.guest [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.788 182729 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 22:30:43 compute-0 podman[222077]: 2026-01-22 22:30:43.852334975 +0000 UTC m=+0.054581317 container create fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.876 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.886 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.902 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:43 compute-0 systemd[1]: Started libpod-conmon-fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221.scope.
Jan 22 22:30:43 compute-0 podman[222077]: 2026-01-22 22:30:43.823883333 +0000 UTC m=+0.026129675 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.919 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.920 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121043.779922, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.920 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Started (Lifecycle Event)
Jan 22 22:30:43 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:30:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5fe36ab893c5864d7a69f30e1816add54a866851c841572676d9f4617ba739/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.956 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:43 compute-0 podman[222077]: 2026-01-22 22:30:43.95880983 +0000 UTC m=+0.161056172 container init fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:43 compute-0 nova_compute[182725]: 2026-01-22 22:30:43.963 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:30:43 compute-0 podman[222077]: 2026-01-22 22:30:43.964982486 +0000 UTC m=+0.167228828 container start fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:43 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [NOTICE]   (222111) : New worker (222118) forked
Jan 22 22:30:43 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [NOTICE]   (222111) : Loading success.
Jan 22 22:30:43 compute-0 podman[222091]: 2026-01-22 22:30:43.995823479 +0000 UTC m=+0.097167038 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:44 compute-0 nova_compute[182725]: 2026-01-22 22:30:44.332 182729 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updated VIF entry in instance network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:30:44 compute-0 nova_compute[182725]: 2026-01-22 22:30:44.333 182729 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:44 compute-0 nova_compute[182725]: 2026-01-22 22:30:44.355 182729 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.800 182729 DEBUG nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.801 182729 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.801 182729 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.802 182729 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.802 182729 DEBUG nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:45 compute-0 nova_compute[182725]: 2026-01-22 22:30:45.803 182729 WARNING nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state resized and task_state None.
Jan 22 22:30:46 compute-0 podman[222129]: 2026-01-22 22:30:46.167761499 +0000 UTC m=+0.089995106 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc.)
Jan 22 22:30:46 compute-0 podman[222128]: 2026-01-22 22:30:46.188984388 +0000 UTC m=+0.123674922 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:30:47 compute-0 nova_compute[182725]: 2026-01-22 22:30:47.892 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:48 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 22:30:48 compute-0 systemd[221702]: Activating special unit Exit the Session...
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped target Main User Target.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped target Basic System.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped target Paths.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped target Sockets.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped target Timers.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 22:30:48 compute-0 systemd[221702]: Closed D-Bus User Message Bus Socket.
Jan 22 22:30:48 compute-0 systemd[221702]: Stopped Create User's Volatile Files and Directories.
Jan 22 22:30:48 compute-0 systemd[221702]: Removed slice User Application Slice.
Jan 22 22:30:48 compute-0 systemd[221702]: Reached target Shutdown.
Jan 22 22:30:48 compute-0 systemd[221702]: Finished Exit the Session.
Jan 22 22:30:48 compute-0 systemd[221702]: Reached target Exit the Session.
Jan 22 22:30:48 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 22:30:48 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 22:30:48 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 22:30:48 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 22:30:48 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 22:30:48 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 22:30:48 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 22:30:48 compute-0 nova_compute[182725]: 2026-01-22 22:30:48.904 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:49 compute-0 nova_compute[182725]: 2026-01-22 22:30:49.623 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121034.622331, 9f290c4e-3649-4826-a9bd-1a7a4a6b7539 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:30:49 compute-0 nova_compute[182725]: 2026-01-22 22:30:49.624 182729 INFO nova.compute.manager [-] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] VM Stopped (Lifecycle Event)
Jan 22 22:30:49 compute-0 nova_compute[182725]: 2026-01-22 22:30:49.674 182729 DEBUG nova.compute.manager [None req-4b19a8e8-c54f-4e08-acc7-bfce7af26b4e - - - - - -] [instance: 9f290c4e-3649-4826-a9bd-1a7a4a6b7539] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:30:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:51.946 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:51 compute-0 nova_compute[182725]: 2026-01-22 22:30:51.947 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:51.948 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:30:52 compute-0 nova_compute[182725]: 2026-01-22 22:30:52.895 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.907 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.957 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.958 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.958 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.958 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.959 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.968 182729 INFO nova.compute.manager [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Terminating instance
Jan 22 22:30:53 compute-0 nova_compute[182725]: 2026-01-22 22:30:53.977 182729 DEBUG nova.compute.manager [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:30:53 compute-0 kernel: tap50b7281e-d0 (unregistering): left promiscuous mode
Jan 22 22:30:54 compute-0 NetworkManager[54954]: <info>  [1769121054.0013] device (tap50b7281e-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.010 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 ovn_controller[94850]: 2026-01-22T22:30:54Z|00268|binding|INFO|Releasing lport 50b7281e-d0dc-4caf-a920-24203f11da00 from this chassis (sb_readonly=0)
Jan 22 22:30:54 compute-0 ovn_controller[94850]: 2026-01-22T22:30:54Z|00269|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 down in Southbound
Jan 22 22:30:54 compute-0 ovn_controller[94850]: 2026-01-22T22:30:54Z|00270|binding|INFO|Removing iface tap50b7281e-d0 ovn-installed in OVS
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.020 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:11:5a 10.100.0.13'], port_security=['fa:16:3e:74:11:5a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=50b7281e-d0dc-4caf-a920-24203f11da00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.022 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 50b7281e-d0dc-4caf-a920-24203f11da00 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.023 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.025 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.025 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e04876-a7b4-4616-b76b-763daba99814]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.026 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore
Jan 22 22:30:54 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 22 22:30:54 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000051.scope: Consumed 11.009s CPU time.
Jan 22 22:30:54 compute-0 systemd-machined[154006]: Machine qemu-33-instance-00000051 terminated.
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [NOTICE]   (222111) : haproxy version is 2.8.14-c23fe91
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [NOTICE]   (222111) : path to executable is /usr/sbin/haproxy
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [WARNING]  (222111) : Exiting Master process...
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [WARNING]  (222111) : Exiting Master process...
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [ALERT]    (222111) : Current worker (222118) exited with code 143 (Terminated)
Jan 22 22:30:54 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222099]: [WARNING]  (222111) : All workers exited. Exiting... (0)
Jan 22 22:30:54 compute-0 systemd[1]: libpod-fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221.scope: Deactivated successfully.
Jan 22 22:30:54 compute-0 podman[222198]: 2026-01-22 22:30:54.203614682 +0000 UTC m=+0.050525180 container died fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221-userdata-shm.mount: Deactivated successfully.
Jan 22 22:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab5fe36ab893c5864d7a69f30e1816add54a866851c841572676d9f4617ba739-merged.mount: Deactivated successfully.
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.259 182729 INFO nova.virt.libvirt.driver [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance destroyed successfully.
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.260 182729 DEBUG nova.objects.instance [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:30:54 compute-0 podman[222198]: 2026-01-22 22:30:54.26504152 +0000 UTC m=+0.111952028 container cleanup fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.273 182729 DEBUG nova.virt.libvirt.vif [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:48Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.273 182729 DEBUG nova.network.os_vif_util [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.274 182729 DEBUG nova.network.os_vif_util [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.274 182729 DEBUG os_vif [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:30:54 compute-0 systemd[1]: libpod-conmon-fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221.scope: Deactivated successfully.
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.276 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.276 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b7281e-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.277 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.280 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.283 182729 INFO os_vif [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0')
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.284 182729 INFO nova.virt.libvirt.driver [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Deleting instance files /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_del
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.285 182729 INFO nova.virt.libvirt.driver [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Deletion of /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_del complete
Jan 22 22:30:54 compute-0 podman[222246]: 2026-01-22 22:30:54.356078001 +0000 UTC m=+0.051917445 container remove fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.363 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbf6976-3332-4741-af4e-553ddee7f118]: (4, ('Thu Jan 22 10:30:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221)\nfed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221\nThu Jan 22 10:30:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (fed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221)\nfed88eb25a7db9ab877eeb74aaef13746f0013cd0ada44ade4bd5f488d15f221\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.365 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb34748-072a-4b70-b7f0-231cadb5914a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.367 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:30:54 compute-0 kernel: tap354683a7-30: left promiscuous mode
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.370 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.378 182729 INFO nova.compute.manager [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.379 182729 DEBUG oslo.service.loopingcall [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.380 182729 DEBUG nova.compute.manager [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.380 182729 DEBUG nova.network.neutron [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.387 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.391 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ff62aae2-05c5-46e0-b260-7b3b3e3f2d60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.415 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d2746922-fdaf-49c5-88eb-c3ea22c2393a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.419 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0e35b9b3-9b29-4c28-98fb-b88c34e743ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.447 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4464b37d-f61c-42ea-a47a-55aef0e63991]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468077, 'reachable_time': 37258, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222260, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.450 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:30:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:54.450 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[6a162368-b36f-4020-99ce-f3b5e2e1bf71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.900 182729 DEBUG nova.compute.manager [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.900 182729 DEBUG oslo_concurrency.lockutils [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.900 182729 DEBUG oslo_concurrency.lockutils [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.901 182729 DEBUG oslo_concurrency.lockutils [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.901 182729 DEBUG nova.compute.manager [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.901 182729 DEBUG nova.compute.manager [req-7b18f8f2-1e17-495b-bdf8-8a6a0a701d06 req-3fdfe5b6-739c-4ac1-a4b8-78e908f361d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:30:54 compute-0 nova_compute[182725]: 2026-01-22 22:30:54.998 182729 DEBUG nova.network.neutron [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.017 182729 INFO nova.compute.manager [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Took 0.64 seconds to deallocate network for instance.
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.096 182729 DEBUG nova.compute.manager [req-7ac4e16c-9ac6-4d90-9b2e-520d5128e5e2 req-df857a86-3a4f-4b70-8dbf-53431d5aaba2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-deleted-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.099 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.099 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.106 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.147 182729 INFO nova.scheduler.client.report [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocations for instance 5a9390a0-5077-46b6-8f6c-b3b308db8b1d
Jan 22 22:30:55 compute-0 nova_compute[182725]: 2026-01-22 22:30:55.262 182729 DEBUG oslo_concurrency.lockutils [None req-03768da9-769c-457d-99d4-efacc20e8c6b b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.006 182729 DEBUG nova.compute.manager [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.006 182729 DEBUG oslo_concurrency.lockutils [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.006 182729 DEBUG oslo_concurrency.lockutils [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.007 182729 DEBUG oslo_concurrency.lockutils [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.007 182729 DEBUG nova.compute.manager [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:30:57 compute-0 nova_compute[182725]: 2026-01-22 22:30:57.007 182729 WARNING nova.compute.manager [req-fcf830e0-7cd1-49d1-9d8d-4cc032e4270f req-dd9fe9cd-b584-4c24-a005-251c5011f594 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state deleted and task_state None.
Jan 22 22:30:57 compute-0 podman[222262]: 2026-01-22 22:30:57.18172865 +0000 UTC m=+0.093906293 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:30:57 compute-0 podman[222263]: 2026-01-22 22:30:57.192280611 +0000 UTC m=+0.101330266 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:30:58 compute-0 nova_compute[182725]: 2026-01-22 22:30:58.909 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:59 compute-0 nova_compute[182725]: 2026-01-22 22:30:59.278 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:30:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:30:59.950 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:01 compute-0 podman[222306]: 2026-01-22 22:31:01.176695085 +0000 UTC m=+0.100508636 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:31:03 compute-0 nova_compute[182725]: 2026-01-22 22:31:03.912 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.017 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.017 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.034 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.133 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.134 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.144 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.145 182729 INFO nova.compute.claims [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.234 182729 DEBUG nova.scheduler.client.report [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.259 182729 DEBUG nova.scheduler.client.report [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.259 182729 DEBUG nova.compute.provider_tree [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.278 182729 DEBUG nova.scheduler.client.report [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.281 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.311 182729 DEBUG nova.scheduler.client.report [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.363 182729 DEBUG nova.compute.provider_tree [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.389 182729 DEBUG nova.scheduler.client.report [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.409 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.410 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.465 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.466 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.484 182729 INFO nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.507 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.660 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.662 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.663 182729 INFO nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Creating image(s)
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.664 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.664 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.666 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.693 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.732 182729 DEBUG nova.policy [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.796 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.797 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.798 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.823 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.915 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:04 compute-0 nova_compute[182725]: 2026-01-22 22:31:04.916 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.190 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk 1073741824" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.192 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.194 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.264 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.266 182729 DEBUG nova.virt.disk.api [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.266 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.326 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.328 182729 DEBUG nova.virt.disk.api [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.329 182729 DEBUG nova.objects.instance [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.348 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.349 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Ensure instance console log exists: /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.350 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.350 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.351 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:05 compute-0 nova_compute[182725]: 2026-01-22 22:31:05.642 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Successfully created port: b56a4401-4c89-482a-a347-ca080a879f8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.506 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Successfully updated port: b56a4401-4c89-482a-a347-ca080a879f8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.525 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.525 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.526 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.678 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.821 182729 DEBUG nova.compute.manager [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.821 182729 DEBUG nova.compute.manager [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing instance network info cache due to event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:31:06 compute-0 nova_compute[182725]: 2026-01-22 22:31:06.822 182729 DEBUG oslo_concurrency.lockutils [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.612 182729 DEBUG nova.network.neutron [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.632 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.632 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance network_info: |[{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.633 182729 DEBUG oslo_concurrency.lockutils [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.633 182729 DEBUG nova.network.neutron [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.637 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start _get_guest_xml network_info=[{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.643 182729 WARNING nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.653 182729 DEBUG nova.virt.libvirt.host [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.653 182729 DEBUG nova.virt.libvirt.host [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.657 182729 DEBUG nova.virt.libvirt.host [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.658 182729 DEBUG nova.virt.libvirt.host [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.660 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.660 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.660 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.661 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.661 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.661 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.662 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.662 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.662 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.663 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.663 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.663 182729 DEBUG nova.virt.hardware [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.668 182729 DEBUG nova.virt.libvirt.vif [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.668 182729 DEBUG nova.network.os_vif_util [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.669 182729 DEBUG nova.network.os_vif_util [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.671 182729 DEBUG nova.objects.instance [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.687 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <uuid>454ec87b-a45c-40af-8bce-d252eea19620</uuid>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <name>instance-00000057</name>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-256562799</nova:name>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:31:07</nova:creationTime>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         <nova:port uuid="b56a4401-4c89-482a-a347-ca080a879f8f">
Jan 22 22:31:07 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <system>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="serial">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="uuid">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </system>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <os>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </os>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <features>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </features>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:f0:b5:89"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <target dev="tapb56a4401-4c"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/console.log" append="off"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <video>
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </video>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:31:07 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:31:07 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:31:07 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:31:07 compute-0 nova_compute[182725]: </domain>
Jan 22 22:31:07 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.689 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Preparing to wait for external event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.690 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.690 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.691 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.692 182729 DEBUG nova.virt.libvirt.vif [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.692 182729 DEBUG nova.network.os_vif_util [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.693 182729 DEBUG nova.network.os_vif_util [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.693 182729 DEBUG os_vif [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.694 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.695 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.695 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.699 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.700 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb56a4401-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.700 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb56a4401-4c, col_values=(('external_ids', {'iface-id': 'b56a4401-4c89-482a-a347-ca080a879f8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:b5:89', 'vm-uuid': '454ec87b-a45c-40af-8bce-d252eea19620'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.702 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:07 compute-0 NetworkManager[54954]: <info>  [1769121067.7041] manager: (tapb56a4401-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.704 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.709 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.709 182729 INFO os_vif [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.764 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.765 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.765 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No VIF found with MAC fa:16:3e:f0:b5:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:31:07 compute-0 nova_compute[182725]: 2026-01-22 22:31:07.767 182729 INFO nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Using config drive
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.292 182729 INFO nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Creating config drive at /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.303 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4mpdn93 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.434 182729 DEBUG oslo_concurrency.processutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4mpdn93" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:08 compute-0 kernel: tapb56a4401-4c: entered promiscuous mode
Jan 22 22:31:08 compute-0 NetworkManager[54954]: <info>  [1769121068.5225] manager: (tapb56a4401-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.524 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:08 compute-0 ovn_controller[94850]: 2026-01-22T22:31:08Z|00271|binding|INFO|Claiming lport b56a4401-4c89-482a-a347-ca080a879f8f for this chassis.
Jan 22 22:31:08 compute-0 ovn_controller[94850]: 2026-01-22T22:31:08Z|00272|binding|INFO|b56a4401-4c89-482a-a347-ca080a879f8f: Claiming fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.529 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.550 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.555 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.558 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:08 compute-0 systemd-machined[154006]: New machine qemu-34-instance-00000057.
Jan 22 22:31:08 compute-0 systemd-udevd[222369]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.575 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d49601-b35c-4a56-b41a-3942f1b81626]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.576 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.580 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.580 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[242f8868-91ac-41ad-928e-1ccf40bbbb90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.581 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94dce340-f7fc-4f83-b637-011220fb57ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 NetworkManager[54954]: <info>  [1769121068.5929] device (tapb56a4401-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:31:08 compute-0 NetworkManager[54954]: <info>  [1769121068.5939] device (tapb56a4401-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.598 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[c71c462e-aa71-40e3-a2f8-6216326d072b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-00000057.
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.618 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:08 compute-0 ovn_controller[94850]: 2026-01-22T22:31:08Z|00273|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f ovn-installed in OVS
Jan 22 22:31:08 compute-0 ovn_controller[94850]: 2026-01-22T22:31:08Z|00274|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f up in Southbound
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.624 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.630 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e05bf57-d96b-4507-93d9-3b874ea877a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.665 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd020dd-7e3c-4060-99f6-511b61a331c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.671 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1b1d42-5450-4523-a4e7-3aec205ac13a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 NetworkManager[54954]: <info>  [1769121068.6733] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.707 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b59985a1-6759-47bd-b9a9-91fe3856dd31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.710 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[5d41725c-8e42-44af-9282-7c88befda2c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 NetworkManager[54954]: <info>  [1769121068.7348] device (tape65877e5-00): carrier: link connected
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.744 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7714c2fc-cfb9-4d59-bfd7-62c35981206e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.769 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ec6f38-40dd-4bcb-8963-99503a506713]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470633, 'reachable_time': 24525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222401, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.793 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[01dd562a-5eb7-40b4-9d25-0bd10a148d09]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 470633, 'tstamp': 470633}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222402, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.825 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[76a9c572-224e-479e-b156-796c349be1f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 3, 'rx_bytes': 180, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 3, 'rx_bytes': 180, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470633, 'reachable_time': 24525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 224, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 224, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222403, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.873 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bb63f72a-e9c3-43c5-9b93-950ea0865ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.914 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.979 182729 DEBUG nova.compute.manager [req-bd483589-1bb9-4bf2-bd09-88223763adc3 req-453eae50-2d2f-49a2-8c53-306bbbb39f2d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.980 182729 DEBUG oslo_concurrency.lockutils [req-bd483589-1bb9-4bf2-bd09-88223763adc3 req-453eae50-2d2f-49a2-8c53-306bbbb39f2d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.980 182729 DEBUG oslo_concurrency.lockutils [req-bd483589-1bb9-4bf2-bd09-88223763adc3 req-453eae50-2d2f-49a2-8c53-306bbbb39f2d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.981 182729 DEBUG oslo_concurrency.lockutils [req-bd483589-1bb9-4bf2-bd09-88223763adc3 req-453eae50-2d2f-49a2-8c53-306bbbb39f2d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:08 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.981 182729 DEBUG nova.compute.manager [req-bd483589-1bb9-4bf2-bd09-88223763adc3 req-453eae50-2d2f-49a2-8c53-306bbbb39f2d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Processing event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.992 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c74769b8-0d80-4ae1-a9dd-5edbac73f43a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.994 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.995 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:08.996 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:08 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:31:09 compute-0 NetworkManager[54954]: <info>  [1769121068.9999] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:08.999 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:09.007 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:09 compute-0 ovn_controller[94850]: 2026-01-22T22:31:09Z|00275|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.009 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:09.014 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:09.015 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff9e1f3-aef8-4505-ab62-94a2b3f51ff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:09.017 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:31:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:09.019 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.022 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.127 182729 DEBUG nova.network.neutron [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updated VIF entry in instance network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.128 182729 DEBUG nova.network.neutron [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.147 182729 DEBUG oslo_concurrency.lockutils [req-e77e1e4e-7955-4d32-85a4-464233924339 req-ffd2e041-7769-47ae-ba44-871e9075589f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.254 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121054.2538042, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.255 182729 INFO nova.compute.manager [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Stopped (Lifecycle Event)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.274 182729 DEBUG nova.compute.manager [None req-e268bb6d-6c70-471d-ac45-3c1939f63974 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.361 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121069.361413, 454ec87b-a45c-40af-8bce-d252eea19620 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.362 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Started (Lifecycle Event)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.364 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.370 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.375 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance spawned successfully.
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.375 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.383 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.389 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.400 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.400 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.401 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.402 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.402 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.402 182729 DEBUG nova.virt.libvirt.driver [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.408 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.408 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121069.3654492, 454ec87b-a45c-40af-8bce-d252eea19620 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.408 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Paused (Lifecycle Event)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.447 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.452 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121069.3693469, 454ec87b-a45c-40af-8bce-d252eea19620 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.453 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Resumed (Lifecycle Event)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.471 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.475 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.488 182729 INFO nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Took 4.83 seconds to spawn the instance on the hypervisor.
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.488 182729 DEBUG nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.495 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:09 compute-0 podman[222442]: 2026-01-22 22:31:09.509925149 +0000 UTC m=+0.056260352 container create 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:09 compute-0 systemd[1]: Started libpod-conmon-0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4.scope.
Jan 22 22:31:09 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:31:09 compute-0 podman[222442]: 2026-01-22 22:31:09.481481216 +0000 UTC m=+0.027816449 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:31:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb9ca6c017f772f08180b18a73b89757879d60369c0db13415ae21aa2b390ef9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.594 182729 INFO nova.compute.manager [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Took 5.51 seconds to build instance.
Jan 22 22:31:09 compute-0 podman[222442]: 2026-01-22 22:31:09.597441843 +0000 UTC m=+0.143777086 container init 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:31:09 compute-0 podman[222442]: 2026-01-22 22:31:09.60866996 +0000 UTC m=+0.155005183 container start 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:31:09 compute-0 nova_compute[182725]: 2026-01-22 22:31:09.630 182729 DEBUG oslo_concurrency.lockutils [None req-189a0bb5-3d73-4aa7-bc78-f396e293be5d 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [NOTICE]   (222462) : New worker (222464) forked
Jan 22 22:31:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [NOTICE]   (222462) : Loading success.
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.092 182729 DEBUG nova.compute.manager [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.092 182729 DEBUG oslo_concurrency.lockutils [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.092 182729 DEBUG oslo_concurrency.lockutils [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.093 182729 DEBUG oslo_concurrency.lockutils [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.093 182729 DEBUG nova.compute.manager [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:11 compute-0 nova_compute[182725]: 2026-01-22 22:31:11.093 182729 WARNING nova.compute.manager [req-d07d4f7c-eb85-48a0-bcc7-bf9595db523d req-0cad9fc5-58d5-4df9-ba9e-7f6d798b3cff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:31:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:12.438 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:12.439 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:12.441 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:12 compute-0 NetworkManager[54954]: <info>  [1769121072.4669] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 22 22:31:12 compute-0 NetworkManager[54954]: <info>  [1769121072.4687] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 22 22:31:12 compute-0 ovn_controller[94850]: 2026-01-22T22:31:12Z|00276|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.465 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:12 compute-0 ovn_controller[94850]: 2026-01-22T22:31:12Z|00277|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.704 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.907 182729 DEBUG nova.compute.manager [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.907 182729 DEBUG nova.compute.manager [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing instance network info cache due to event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.908 182729 DEBUG oslo_concurrency.lockutils [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.908 182729 DEBUG oslo_concurrency.lockutils [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:12 compute-0 nova_compute[182725]: 2026-01-22 22:31:12.908 182729 DEBUG nova.network.neutron [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:31:13 compute-0 nova_compute[182725]: 2026-01-22 22:31:13.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:13 compute-0 nova_compute[182725]: 2026-01-22 22:31:13.917 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:14 compute-0 podman[222474]: 2026-01-22 22:31:14.173228705 +0000 UTC m=+0.093484532 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.684 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.779 182729 DEBUG nova.network.neutron [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updated VIF entry in instance network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.780 182729 DEBUG nova.network.neutron [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.811 182729 DEBUG oslo_concurrency.lockutils [req-18ffc153-f713-47af-a90f-984805d0d55e req-edee804c-f8d2-4034-a71d-152282a844a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.934 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.935 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.936 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:14 compute-0 nova_compute[182725]: 2026-01-22 22:31:14.937 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.021 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.099 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.101 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.193 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.377 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.378 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5517MB free_disk=73.36766052246094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.378 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.379 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.465 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 454ec87b-a45c-40af-8bce-d252eea19620 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.466 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.466 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.531 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.548 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.567 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:31:15 compute-0 nova_compute[182725]: 2026-01-22 22:31:15.568 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:16 compute-0 nova_compute[182725]: 2026-01-22 22:31:16.944 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:17 compute-0 podman[222504]: 2026-01-22 22:31:17.142853175 +0000 UTC m=+0.067238683 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 22:31:17 compute-0 podman[222503]: 2026-01-22 22:31:17.174876197 +0000 UTC m=+0.101774137 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:31:17 compute-0 nova_compute[182725]: 2026-01-22 22:31:17.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.320 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.322 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.343 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.417 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.418 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.426 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.426 182729 INFO nova.compute.claims [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.561 182729 DEBUG nova.compute.provider_tree [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.568 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.577 182729 DEBUG nova.scheduler.client.report [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.609 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.611 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.661 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.662 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.680 182729 INFO nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.697 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.817 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.820 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.821 182729 INFO nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Creating image(s)
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.822 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.823 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.824 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.851 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.944 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.947 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.948 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:18 compute-0 nova_compute[182725]: 2026-01-22 22:31:18.971 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.075 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.077 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.134 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.136 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.137 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.215 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.218 182729 DEBUG nova.virt.disk.api [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Checking if we can resize image /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.218 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.282 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.285 182729 DEBUG nova.virt.disk.api [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Cannot resize image /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.285 182729 DEBUG nova.objects.instance [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'migration_context' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.304 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.305 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Ensure instance console log exists: /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.306 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.307 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.308 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.362 182729 DEBUG nova.policy [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:31:19 compute-0 nova_compute[182725]: 2026-01-22 22:31:19.929 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:31:21 compute-0 nova_compute[182725]: 2026-01-22 22:31:21.730 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Successfully created port: 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:31:22 compute-0 ovn_controller[94850]: 2026-01-22T22:31:22Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:31:22 compute-0 ovn_controller[94850]: 2026-01-22T22:31:22Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.713 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.815 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Successfully updated port: 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.852 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.853 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquired lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.854 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.973 182729 DEBUG nova.compute.manager [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-changed-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.974 182729 DEBUG nova.compute.manager [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Refreshing instance network info cache due to event network-changed-8e8cfdc3-60bc-4edf-89ba-c53573ea3141. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:31:22 compute-0 nova_compute[182725]: 2026-01-22 22:31:22.974 182729 DEBUG oslo_concurrency.lockutils [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:23 compute-0 nova_compute[182725]: 2026-01-22 22:31:23.075 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:31:23 compute-0 nova_compute[182725]: 2026-01-22 22:31:23.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:23 compute-0 nova_compute[182725]: 2026-01-22 22:31:23.923 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.031 182729 DEBUG nova.network.neutron [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Updating instance_info_cache with network_info: [{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.086 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Releasing lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.087 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance network_info: |[{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.087 182729 DEBUG oslo_concurrency.lockutils [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.088 182729 DEBUG nova.network.neutron [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Refreshing network info cache for port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.092 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start _get_guest_xml network_info=[{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.099 182729 WARNING nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.104 182729 DEBUG nova.virt.libvirt.host [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.104 182729 DEBUG nova.virt.libvirt.host [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.108 182729 DEBUG nova.virt.libvirt.host [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.109 182729 DEBUG nova.virt.libvirt.host [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.111 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.111 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.112 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.112 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.112 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.113 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.113 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.113 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.114 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.114 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.114 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.114 182729 DEBUG nova.virt.hardware [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.119 182729 DEBUG nova.virt.libvirt.vif [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-Lis
tServerFiltersTestJSON-1169398826-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:18Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.120 182729 DEBUG nova.network.os_vif_util [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.121 182729 DEBUG nova.network.os_vif_util [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.122 182729 DEBUG nova.objects.instance [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'pci_devices' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.137 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <uuid>815ebbb8-e2c4-4f72-8048-df7c53f1439a</uuid>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <name>instance-00000058</name>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-871599483</nova:name>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:31:25</nova:creationTime>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:user uuid="b6f50d0e6a7444f0ac9c928363915afb">tempest-ListServerFiltersTestJSON-1169398826-project-member</nova:user>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:project uuid="802c49a328ca49e3a4ea4e46b9a9f5eb">tempest-ListServerFiltersTestJSON-1169398826</nova:project>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         <nova:port uuid="8e8cfdc3-60bc-4edf-89ba-c53573ea3141">
Jan 22 22:31:25 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <system>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="serial">815ebbb8-e2c4-4f72-8048-df7c53f1439a</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="uuid">815ebbb8-e2c4-4f72-8048-df7c53f1439a</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </system>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <os>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </os>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <features>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </features>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:b3:60:70"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <target dev="tap8e8cfdc3-60"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/console.log" append="off"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <video>
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </video>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:31:25 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:31:25 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:31:25 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:31:25 compute-0 nova_compute[182725]: </domain>
Jan 22 22:31:25 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.138 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Preparing to wait for external event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.139 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.139 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.140 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.141 182729 DEBUG nova.virt.libvirt.vif [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:18Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.141 182729 DEBUG nova.network.os_vif_util [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.142 182729 DEBUG nova.network.os_vif_util [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.143 182729 DEBUG os_vif [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.144 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.145 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.145 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.150 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.150 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e8cfdc3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.151 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8e8cfdc3-60, col_values=(('external_ids', {'iface-id': '8e8cfdc3-60bc-4edf-89ba-c53573ea3141', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:60:70', 'vm-uuid': '815ebbb8-e2c4-4f72-8048-df7c53f1439a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:25 compute-0 NetworkManager[54954]: <info>  [1769121085.1551] manager: (tap8e8cfdc3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.163 182729 INFO os_vif [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60')
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.230 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.231 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.231 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No VIF found with MAC fa:16:3e:b3:60:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.232 182729 INFO nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Using config drive
Jan 22 22:31:25 compute-0 nova_compute[182725]: 2026-01-22 22:31:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.171 182729 INFO nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Creating config drive at /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.177 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrhw5zqn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.307 182729 DEBUG oslo_concurrency.processutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrhw5zqn" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:26 compute-0 kernel: tap8e8cfdc3-60: entered promiscuous mode
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.3805] manager: (tap8e8cfdc3-60): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Jan 22 22:31:26 compute-0 ovn_controller[94850]: 2026-01-22T22:31:26Z|00278|binding|INFO|Claiming lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for this chassis.
Jan 22 22:31:26 compute-0 ovn_controller[94850]: 2026-01-22T22:31:26Z|00279|binding|INFO|8e8cfdc3-60bc-4edf-89ba-c53573ea3141: Claiming fa:16:3e:b3:60:70 10.100.0.14
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.381 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.383 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.394 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:60:70 10.100.0.14'], port_security=['fa:16:3e:b3:60:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=8e8cfdc3-60bc-4edf-89ba-c53573ea3141) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.395 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 in datapath f234f62b-5371-4527-94e7-91cf5da3055e bound to our chassis
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.397 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.409 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3bc722-4504-4539-8ae5-57eb915f1fb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.410 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf234f62b-51 in ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.412 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf234f62b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.413 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2cec0ef-bd95-4add-ab94-c4c012b833ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 systemd-udevd[222598]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.413 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e006c93f-1854-446c-83bc-c6c88408ee95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_controller[94850]: 2026-01-22T22:31:26Z|00280|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 ovn-installed in OVS
Jan 22 22:31:26 compute-0 ovn_controller[94850]: 2026-01-22T22:31:26Z|00281|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 up in Southbound
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.419 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.420 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.428 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[38e99832-2ab6-4056-ace8-a31a0b3b8314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.4311] device (tap8e8cfdc3-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.4321] device (tap8e8cfdc3-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:31:26 compute-0 systemd-machined[154006]: New machine qemu-35-instance-00000058.
Jan 22 22:31:26 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-00000058.
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.455 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[41d62a25-63ba-4e7a-9097-727d71f74cf0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.484 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a32d03-5ede-402a-92c6-2ffccbae51c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.490 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a41838-4c2f-4d03-a2de-8bcc48b8fafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 systemd-udevd[222602]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.4914] manager: (tapf234f62b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/143)
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.527 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d817f5ac-779d-4d1a-a28d-68c234f38065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.530 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[513b5fe5-744b-4a16-8652-e54faf78ac26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.5518] device (tapf234f62b-50): carrier: link connected
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.555 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[342ab597-2923-4dd7-bc91-fea5f59238e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.575 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e7fd02-ca78-4fa2-8253-7a3f91a91f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472415, 'reachable_time': 38366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222631, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.594 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6e920a7c-0b86-4bde-9ba7-585b28ceac26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:3df6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472415, 'tstamp': 472415}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222637, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.609 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d611a12a-80c0-4300-a864-2ac556d2a07c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472415, 'reachable_time': 38366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222639, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.647 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[543e2bb2-a216-4dab-9423-e8f3ebea96a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.699 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121086.698351, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.700 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Started (Lifecycle Event)
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.714 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5b719e-4397-4b9c-8221-74207a94d288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.716 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.717 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.717 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf234f62b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.719 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 NetworkManager[54954]: <info>  [1769121086.7203] manager: (tapf234f62b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 22 22:31:26 compute-0 kernel: tapf234f62b-50: entered promiscuous mode
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.721 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.723 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf234f62b-50, col_values=(('external_ids', {'iface-id': '0a1fd4a8-b506-4c9d-9846-1c0ab542e465'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:26 compute-0 ovn_controller[94850]: 2026-01-22T22:31:26Z|00282|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=0)
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.724 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.729 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.733 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121086.6986663, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.733 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Paused (Lifecycle Event)
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.737 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.738 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d6b3ea-ec1d-4a51-8d8d-5ccdeb919e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.739 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:31:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:26.739 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'env', 'PROCESS_TAG=haproxy-f234f62b-5371-4527-94e7-91cf5da3055e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f234f62b-5371-4527-94e7-91cf5da3055e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.752 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.756 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.777 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.856 182729 DEBUG nova.compute.manager [req-3b594d6e-7dc1-4ac4-b965-0b3d9d11afe1 req-71db48bf-5472-48ca-a7e2-70931ee8a064 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.857 182729 DEBUG oslo_concurrency.lockutils [req-3b594d6e-7dc1-4ac4-b965-0b3d9d11afe1 req-71db48bf-5472-48ca-a7e2-70931ee8a064 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.857 182729 DEBUG oslo_concurrency.lockutils [req-3b594d6e-7dc1-4ac4-b965-0b3d9d11afe1 req-71db48bf-5472-48ca-a7e2-70931ee8a064 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.857 182729 DEBUG oslo_concurrency.lockutils [req-3b594d6e-7dc1-4ac4-b965-0b3d9d11afe1 req-71db48bf-5472-48ca-a7e2-70931ee8a064 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.857 182729 DEBUG nova.compute.manager [req-3b594d6e-7dc1-4ac4-b965-0b3d9d11afe1 req-71db48bf-5472-48ca-a7e2-70931ee8a064 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Processing event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.858 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.861 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121086.8612704, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.861 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Resumed (Lifecycle Event)
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.867 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.870 182729 INFO nova.virt.libvirt.driver [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance spawned successfully.
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.870 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.900 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.904 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.904 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.904 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.905 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.905 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.905 182729 DEBUG nova.virt.libvirt.driver [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.909 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:26 compute-0 nova_compute[182725]: 2026-01-22 22:31:26.987 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.006 182729 DEBUG nova.network.neutron [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Updated VIF entry in instance network info cache for port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.006 182729 DEBUG nova.network.neutron [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Updating instance_info_cache with network_info: [{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.025 182729 INFO nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Took 8.21 seconds to spawn the instance on the hypervisor.
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.025 182729 DEBUG nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.031 182729 DEBUG oslo_concurrency.lockutils [req-8b58194d-0ccb-4cbb-93db-b2372c34c7ad req-49b4fc82-8529-48ea-a5af-ce6623f52e18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.131 182729 INFO nova.compute.manager [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Took 8.74 seconds to build instance.
Jan 22 22:31:27 compute-0 podman[222670]: 2026-01-22 22:31:27.144558946 +0000 UTC m=+0.057016021 container create 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.170 182729 DEBUG oslo_concurrency.lockutils [None req-578c794f-58f6-47f1-adc9-f69694e00151 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:27 compute-0 systemd[1]: Started libpod-conmon-60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d.scope.
Jan 22 22:31:27 compute-0 podman[222670]: 2026-01-22 22:31:27.113467837 +0000 UTC m=+0.025924952 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:31:27 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:31:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/414d0b7f4032cc48121e00a1957d860e562adde16c3e0011d509534283378c86/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:31:27 compute-0 podman[222670]: 2026-01-22 22:31:27.239667047 +0000 UTC m=+0.152124142 container init 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:31:27 compute-0 podman[222670]: 2026-01-22 22:31:27.244999649 +0000 UTC m=+0.157456724 container start 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 22:31:27 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [NOTICE]   (222709) : New worker (222725) forked
Jan 22 22:31:27 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [NOTICE]   (222709) : Loading success.
Jan 22 22:31:27 compute-0 podman[222687]: 2026-01-22 22:31:27.281777998 +0000 UTC m=+0.060272221 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:31:27 compute-0 podman[222689]: 2026-01-22 22:31:27.286912235 +0000 UTC m=+0.064681760 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:31:27 compute-0 nova_compute[182725]: 2026-01-22 22:31:27.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.926 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.952 182729 DEBUG nova.compute.manager [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.953 182729 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.953 182729 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.953 182729 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.953 182729 DEBUG nova.compute.manager [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:28 compute-0 nova_compute[182725]: 2026-01-22 22:31:28.953 182729 WARNING nova.compute.manager [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state active and task_state None.
Jan 22 22:31:30 compute-0 nova_compute[182725]: 2026-01-22 22:31:30.155 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:32 compute-0 podman[222743]: 2026-01-22 22:31:32.12855691 +0000 UTC m=+0.055081333 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:31:33 compute-0 nova_compute[182725]: 2026-01-22 22:31:33.928 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.279 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.280 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.280 182729 INFO nova.compute.manager [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Rebooting instance
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.306 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.307 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:34 compute-0 nova_compute[182725]: 2026-01-22 22:31:34.308 182729 DEBUG nova.network.neutron [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:31:35 compute-0 nova_compute[182725]: 2026-01-22 22:31:35.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:38 compute-0 nova_compute[182725]: 2026-01-22 22:31:38.930 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:39 compute-0 ovn_controller[94850]: 2026-01-22T22:31:39Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:60:70 10.100.0.14
Jan 22 22:31:39 compute-0 ovn_controller[94850]: 2026-01-22T22:31:39Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:60:70 10.100.0.14
Jan 22 22:31:39 compute-0 nova_compute[182725]: 2026-01-22 22:31:39.675 182729 DEBUG nova.network.neutron [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:39 compute-0 nova_compute[182725]: 2026-01-22 22:31:39.702 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:39 compute-0 nova_compute[182725]: 2026-01-22 22:31:39.732 182729 DEBUG nova.compute.manager [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:40 compute-0 kernel: tapb56a4401-4c (unregistering): left promiscuous mode
Jan 22 22:31:40 compute-0 NetworkManager[54954]: <info>  [1769121100.0208] device (tapb56a4401-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00283|binding|INFO|Releasing lport b56a4401-4c89-482a-a347-ca080a879f8f from this chassis (sb_readonly=0)
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00284|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f down in Southbound
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00285|binding|INFO|Removing iface tapb56a4401-4c ovn-installed in OVS
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.049 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.058 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.060 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.061 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.062 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebcb443-521e-47b0-9a8c-81e7b8ee39f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.063 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:31:40 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 22 22:31:40 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000057.scope: Consumed 14.153s CPU time.
Jan 22 22:31:40 compute-0 systemd-machined[154006]: Machine qemu-34-instance-00000057 terminated.
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [NOTICE]   (222462) : haproxy version is 2.8.14-c23fe91
Jan 22 22:31:40 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [NOTICE]   (222462) : path to executable is /usr/sbin/haproxy
Jan 22 22:31:40 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [WARNING]  (222462) : Exiting Master process...
Jan 22 22:31:40 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [ALERT]    (222462) : Current worker (222464) exited with code 143 (Terminated)
Jan 22 22:31:40 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222458]: [WARNING]  (222462) : All workers exited. Exiting... (0)
Jan 22 22:31:40 compute-0 systemd[1]: libpod-0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4.scope: Deactivated successfully.
Jan 22 22:31:40 compute-0 podman[222812]: 2026-01-22 22:31:40.228217682 +0000 UTC m=+0.054722994 container died 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4-userdata-shm.mount: Deactivated successfully.
Jan 22 22:31:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-eb9ca6c017f772f08180b18a73b89757879d60369c0db13415ae21aa2b390ef9-merged.mount: Deactivated successfully.
Jan 22 22:31:40 compute-0 podman[222812]: 2026-01-22 22:31:40.27385679 +0000 UTC m=+0.100362112 container cleanup 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:40 compute-0 systemd[1]: libpod-conmon-0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4.scope: Deactivated successfully.
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.288 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance destroyed successfully.
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.289 182729 DEBUG nova.objects.instance [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.303 182729 DEBUG nova.virt.libvirt.vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.304 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.305 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.305 182729 DEBUG os_vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.306 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.307 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb56a4401-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.309 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.312 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.315 182729 INFO os_vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.325 182729 DEBUG nova.virt.libvirt.driver [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start _get_guest_xml network_info=[{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.330 182729 WARNING nova.virt.libvirt.driver [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.340 182729 DEBUG nova.virt.libvirt.host [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.341 182729 DEBUG nova.virt.libvirt.host [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.344 182729 DEBUG nova.virt.libvirt.host [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.346 182729 DEBUG nova.virt.libvirt.host [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.348 182729 DEBUG nova.virt.libvirt.driver [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.349 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.350 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.350 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.351 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.351 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.352 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.352 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.353 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.353 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:31:40 compute-0 podman[222854]: 2026-01-22 22:31:40.353809726 +0000 UTC m=+0.049637828 container remove 0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.354 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.354 182729 DEBUG nova.virt.hardware [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.354 182729 DEBUG nova.objects.instance [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.360 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1e07c84d-8353-403d-980e-b96c27415be6]: (4, ('Thu Jan 22 10:31:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4)\n0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4\nThu Jan 22 10:31:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4)\n0ba52a0e3571251e20890be8fb9e31e8bd63975150b118699d2620275ff8e6a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.362 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b101e050-6e19-40d7-aa29-a934664c861d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.364 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:40 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.366 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.372 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20c1d4a6-c76f-425c-b4ca-37aea0c8ee6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.382 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.386 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[75f0d826-439d-4778-b89c-66f566a0fe1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.388 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9c16792b-f605-4c76-9f2c-be90773cf357]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.407 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.407 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e51df3cd-df6e-412f-8fcd-421066fbb2c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470626, 'reachable_time': 31695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222871, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.412 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.412 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a7018a5e-12de-4732-b98d-761f4f50143a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.448 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.449 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.449 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.450 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.451 182729 DEBUG nova.virt.libvirt.vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.451 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.452 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.453 182729 DEBUG nova.objects.instance [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.469 182729 DEBUG nova.virt.libvirt.driver [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <uuid>454ec87b-a45c-40af-8bce-d252eea19620</uuid>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <name>instance-00000057</name>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-256562799</nova:name>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:31:40</nova:creationTime>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         <nova:port uuid="b56a4401-4c89-482a-a347-ca080a879f8f">
Jan 22 22:31:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <system>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="serial">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="uuid">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </system>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <os>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </os>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <features>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </features>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:f0:b5:89"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <target dev="tapb56a4401-4c"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/console.log" append="off"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <video>
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </video>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:31:40 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:31:40 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:31:40 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:31:40 compute-0 nova_compute[182725]: </domain>
Jan 22 22:31:40 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.471 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.543 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.545 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.622 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.624 182729 DEBUG nova.objects.instance [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'trusted_certs' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.646 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.705 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.706 182729 DEBUG nova.virt.disk.api [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.707 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.764 182729 DEBUG oslo_concurrency.processutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.765 182729 DEBUG nova.virt.disk.api [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.766 182729 DEBUG nova.objects.instance [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.780 182729 DEBUG nova.virt.libvirt.vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.781 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.782 182729 DEBUG nova.network.os_vif_util [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.783 182729 DEBUG os_vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.783 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.784 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.784 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.787 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.787 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb56a4401-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.788 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb56a4401-4c, col_values=(('external_ids', {'iface-id': 'b56a4401-4c89-482a-a347-ca080a879f8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:b5:89', 'vm-uuid': '454ec87b-a45c-40af-8bce-d252eea19620'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.790 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 NetworkManager[54954]: <info>  [1769121100.7917] manager: (tapb56a4401-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.797 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.798 182729 INFO os_vif [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:31:40 compute-0 kernel: tapb56a4401-4c: entered promiscuous mode
Jan 22 22:31:40 compute-0 systemd-udevd[222790]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:40 compute-0 NetworkManager[54954]: <info>  [1769121100.8863] manager: (tapb56a4401-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.885 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00286|binding|INFO|Claiming lport b56a4401-4c89-482a-a347-ca080a879f8f for this chassis.
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00287|binding|INFO|b56a4401-4c89-482a-a347-ca080a879f8f: Claiming fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.888 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 NetworkManager[54954]: <info>  [1769121100.8972] device (tapb56a4401-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:31:40 compute-0 NetworkManager[54954]: <info>  [1769121100.8979] device (tapb56a4401-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.900 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.901 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.902 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.916 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1d74862d-90be-4d19-ba96-07e9f3c69ee1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.917 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00288|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f ovn-installed in OVS
Jan 22 22:31:40 compute-0 ovn_controller[94850]: 2026-01-22T22:31:40Z|00289|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f up in Southbound
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.919 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.919 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5098b9f4-54f4-424c-9343-f11775d6cc0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 nova_compute[182725]: 2026-01-22 22:31:40.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.920 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5deb520-7fa5-4330-a20d-071b0c4d3a1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.935 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee0e985-4287-40de-9568-76bb7cf8cac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 systemd-machined[154006]: New machine qemu-36-instance-00000057.
Jan 22 22:31:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.958 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae94ea9-c157-4c2d-a879-89f9ab3f4883]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:40 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000057.
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:40.999 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1148e4cb-abb8-4aa9-b84c-acd86797b370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.006 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc64d66-46ca-4434-96db-74dfbc08202e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 NetworkManager[54954]: <info>  [1769121101.0077] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.029 182729 DEBUG nova.compute.manager [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.031 182729 DEBUG oslo_concurrency.lockutils [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.033 182729 DEBUG oslo_concurrency.lockutils [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.033 182729 DEBUG oslo_concurrency.lockutils [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.034 182729 DEBUG nova.compute.manager [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.035 182729 WARNING nova.compute.manager [req-231541de-b678-409f-9485-ddf653fbc6b9 req-1c2c191e-f0c7-4ede-8852-bd8d57bdf110 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state reboot_started_hard.
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.062 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[50c9d74f-8b3c-4178-be93-ffaa19506ee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.070 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[48e764bf-c97d-4b91-a73c-1d6726158679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 NetworkManager[54954]: <info>  [1769121101.1026] device (tape65877e5-00): carrier: link connected
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.107 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b811fb-144d-4b52-a7b3-ec1d80e91b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.126 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[10be86aa-aca2-4dad-9cfd-26ebd3945c69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473870, 'reachable_time': 36269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222932, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.141 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[66eb9852-cd61-4ed2-90f9-163a3dadfb98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473870, 'tstamp': 473870}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222933, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.161 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d58eca94-6824-43c4-8e48-56eb130b0208]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473870, 'reachable_time': 36269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222934, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.192 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[22a1c6c3-1245-4d87-84ed-584c7f984fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.256 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1bdb4ee5-2746-4816-b4e9-08c918f641af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.258 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.258 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.259 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:41 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.262 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.265 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:41 compute-0 ovn_controller[94850]: 2026-01-22T22:31:41Z|00290|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:41.268 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:31:41 compute-0 nova_compute[182725]: 2026-01-22 22:31:41.278 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:42 compute-0 NetworkManager[54954]: <info>  [1769121102.0201] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:42.095 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1f49327a-9bac-43c7-8512-12226939bbbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:42.099 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:31:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:42.100 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.238 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 454ec87b-a45c-40af-8bce-d252eea19620 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.239 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121102.237661, 454ec87b-a45c-40af-8bce-d252eea19620 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.240 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Resumed (Lifecycle Event)
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.242 182729 DEBUG nova.compute.manager [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.247 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance rebooted successfully.
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.247 182729 DEBUG nova.compute.manager [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.258 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.262 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.296 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.297 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121102.2389295, 454ec87b-a45c-40af-8bce-d252eea19620 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.297 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Started (Lifecycle Event)
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.332 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.339 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:42 compute-0 nova_compute[182725]: 2026-01-22 22:31:42.347 182729 DEBUG oslo_concurrency.lockutils [None req-07e60857-0c0f-4eb3-98ea-3447323cf97c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 8.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:42 compute-0 podman[222973]: 2026-01-22 22:31:42.515056191 +0000 UTC m=+0.049759821 container create ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:31:42 compute-0 systemd[1]: Started libpod-conmon-ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079.scope.
Jan 22 22:31:42 compute-0 podman[222973]: 2026-01-22 22:31:42.48871504 +0000 UTC m=+0.023418690 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:31:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:31:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a1b4bb1be6b7574cf5f9bf1d4a6b3ceb9d4591a90287c9786d99a030e11a628/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:31:42 compute-0 podman[222973]: 2026-01-22 22:31:42.627938852 +0000 UTC m=+0.162642502 container init ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:31:42 compute-0 podman[222973]: 2026-01-22 22:31:42.633314875 +0000 UTC m=+0.168018505 container start ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:31:42 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [NOTICE]   (222992) : New worker (222994) forked
Jan 22 22:31:42 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [NOTICE]   (222992) : Loading success.
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.846 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.847 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.847 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.847 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.848 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.848 182729 WARNING nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.848 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.849 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.849 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.849 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.849 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.850 182729 WARNING nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.850 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.850 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.850 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.851 182729 DEBUG oslo_concurrency.lockutils [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.851 182729 DEBUG nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.851 182729 WARNING nova.compute.manager [req-844fda6e-17a4-4e8f-ae04-b21da63cb72b req-3affa3df-9612-4c72-9a8e-2e6f4001a166 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:31:43 compute-0 nova_compute[182725]: 2026-01-22 22:31:43.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:44 compute-0 nova_compute[182725]: 2026-01-22 22:31:44.759 182729 INFO nova.compute.manager [None req-d22b2568-66d7-4463-a8ee-2a9a9e41fa11 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Get console output
Jan 22 22:31:44 compute-0 nova_compute[182725]: 2026-01-22 22:31:44.885 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:31:45 compute-0 podman[223004]: 2026-01-22 22:31:45.165279415 +0000 UTC m=+0.091425623 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:31:45 compute-0 nova_compute[182725]: 2026-01-22 22:31:45.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.060 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.061 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.082 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:31:48 compute-0 podman[223025]: 2026-01-22 22:31:48.158565498 +0000 UTC m=+0.072897947 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 22 22:31:48 compute-0 podman[223024]: 2026-01-22 22:31:48.184719967 +0000 UTC m=+0.104405690 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.203 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.204 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.213 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.214 182729 INFO nova.compute.claims [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.423 182729 DEBUG nova.compute.provider_tree [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.440 182729 DEBUG nova.scheduler.client.report [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.469 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.470 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.531 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.532 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.550 182729 INFO nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.573 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.703 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.705 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.706 182729 INFO nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Creating image(s)
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.707 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.708 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.709 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.730 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.808 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.809 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.809 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.820 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.868 182729 DEBUG nova.policy [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.881 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.882 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.922 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.923 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.924 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.982 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.983 182729 DEBUG nova.virt.disk.api [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:31:48 compute-0 nova_compute[182725]: 2026-01-22 22:31:48.983 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.045 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.047 182729 DEBUG nova.virt.disk.api [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.047 182729 DEBUG nova.objects.instance [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid 63595c63-b40f-4491-be23-cc90675eb94e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.066 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.067 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Ensure instance console log exists: /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.067 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.067 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.068 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.162 182729 DEBUG oslo_concurrency.lockutils [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.162 182729 DEBUG oslo_concurrency.lockutils [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.163 182729 DEBUG nova.compute.manager [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.167 182729 DEBUG nova.compute.manager [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.169 182729 DEBUG nova.objects.instance [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'flavor' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.192 182729 DEBUG nova.objects.instance [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'info_cache' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.232 182729 DEBUG nova.virt.libvirt.driver [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.412 182729 DEBUG oslo_concurrency.lockutils [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.413 182729 DEBUG oslo_concurrency.lockutils [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.413 182729 DEBUG nova.compute.manager [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.419 182729 DEBUG nova.compute.manager [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.421 182729 DEBUG nova.objects.instance [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.454 182729 DEBUG nova.objects.instance [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'info_cache' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.487 182729 DEBUG nova.virt.libvirt.driver [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:31:49 compute-0 nova_compute[182725]: 2026-01-22 22:31:49.691 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Successfully created port: 108a1530-5cf0-483e-8cf2-e5ea0abc867d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.655 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Successfully updated port: 108a1530-5cf0-483e-8cf2-e5ea0abc867d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.670 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.670 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.670 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.793 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.818 182729 DEBUG nova.compute.manager [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-changed-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.818 182729 DEBUG nova.compute.manager [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Refreshing instance network info cache due to event network-changed-108a1530-5cf0-483e-8cf2-e5ea0abc867d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.818 182729 DEBUG oslo_concurrency.lockutils [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:50 compute-0 nova_compute[182725]: 2026-01-22 22:31:50.849 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:31:51 compute-0 kernel: tap8e8cfdc3-60 (unregistering): left promiscuous mode
Jan 22 22:31:51 compute-0 NetworkManager[54954]: <info>  [1769121111.4451] device (tap8e8cfdc3-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.445 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 ovn_controller[94850]: 2026-01-22T22:31:51Z|00291|binding|INFO|Releasing lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 from this chassis (sb_readonly=0)
Jan 22 22:31:51 compute-0 ovn_controller[94850]: 2026-01-22T22:31:51Z|00292|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 down in Southbound
Jan 22 22:31:51 compute-0 ovn_controller[94850]: 2026-01-22T22:31:51Z|00293|binding|INFO|Removing iface tap8e8cfdc3-60 ovn-installed in OVS
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.452 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:60:70 10.100.0.14'], port_security=['fa:16:3e:b3:60:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=8e8cfdc3-60bc-4edf-89ba-c53573ea3141) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.454 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 in datapath f234f62b-5371-4527-94e7-91cf5da3055e unbound from our chassis
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.455 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f234f62b-5371-4527-94e7-91cf5da3055e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.457 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42a24bad-83a2-45df-b91c-157e75cbc61d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.457 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace which is not needed anymore
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.469 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 22 22:31:51 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000058.scope: Consumed 12.719s CPU time.
Jan 22 22:31:51 compute-0 systemd-machined[154006]: Machine qemu-35-instance-00000058 terminated.
Jan 22 22:31:51 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [NOTICE]   (222709) : haproxy version is 2.8.14-c23fe91
Jan 22 22:31:51 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [NOTICE]   (222709) : path to executable is /usr/sbin/haproxy
Jan 22 22:31:51 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [WARNING]  (222709) : Exiting Master process...
Jan 22 22:31:51 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [ALERT]    (222709) : Current worker (222725) exited with code 143 (Terminated)
Jan 22 22:31:51 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[222685]: [WARNING]  (222709) : All workers exited. Exiting... (0)
Jan 22 22:31:51 compute-0 systemd[1]: libpod-60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d.scope: Deactivated successfully.
Jan 22 22:31:51 compute-0 podman[223108]: 2026-01-22 22:31:51.608389476 +0000 UTC m=+0.049868107 container died 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 22:31:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d-userdata-shm.mount: Deactivated successfully.
Jan 22 22:31:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-414d0b7f4032cc48121e00a1957d860e562adde16c3e0011d509534283378c86-merged.mount: Deactivated successfully.
Jan 22 22:31:51 compute-0 podman[223108]: 2026-01-22 22:31:51.648652 +0000 UTC m=+0.090130631 container cleanup 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:31:51 compute-0 systemd[1]: libpod-conmon-60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d.scope: Deactivated successfully.
Jan 22 22:31:51 compute-0 podman[223137]: 2026-01-22 22:31:51.725521946 +0000 UTC m=+0.051304423 container remove 60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.734 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[69e7b5a1-17be-4712-a73d-87420db0c66d]: (4, ('Thu Jan 22 10:31:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d)\n60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d\nThu Jan 22 10:31:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d)\n60cdd2672806d36369cf610e0c45d94a48edfb25a18ff73a12e65b41e72b529d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.736 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cce20b-7bc3-461f-9389-6a0252f0fdd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.737 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.741 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 kernel: tapf234f62b-50: left promiscuous mode
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.747 182729 DEBUG nova.compute.manager [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.748 182729 DEBUG oslo_concurrency.lockutils [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.748 182729 DEBUG oslo_concurrency.lockutils [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.748 182729 DEBUG oslo_concurrency.lockutils [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.748 182729 DEBUG nova.compute.manager [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.748 182729 WARNING nova.compute.manager [req-90451521-5e58-4334-ad9f-d3592a6e8e2e req-1c5ea743-b75c-4735-a5fc-29963913fbdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state active and task_state powering-off.
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.760 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e32160c4-e872-4347-9e92-2f9f83ce6956]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.780 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6e017a7a-e537-44f8-8c11-5bb02522315b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.782 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bddf7d20-e8ed-4682-b86f-7ac26af17dc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.802 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f00738-56bb-42fb-a884-a8546bbbed7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472408, 'reachable_time': 43644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223172, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.805 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:31:51 compute-0 systemd[1]: run-netns-ovnmeta\x2df234f62b\x2d5371\x2d4527\x2d94e7\x2d91cf5da3055e.mount: Deactivated successfully.
Jan 22 22:31:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:51.805 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[faac1eae-e975-4bca-8b4d-0f136ffb5c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.878 182729 DEBUG nova.network.neutron [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Updating instance_info_cache with network_info: [{"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.898 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.898 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Instance network_info: |[{"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.899 182729 DEBUG oslo_concurrency.lockutils [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.899 182729 DEBUG nova.network.neutron [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Refreshing network info cache for port 108a1530-5cf0-483e-8cf2-e5ea0abc867d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.902 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Start _get_guest_xml network_info=[{"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.907 182729 WARNING nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.912 182729 DEBUG nova.virt.libvirt.host [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.912 182729 DEBUG nova.virt.libvirt.host [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.917 182729 DEBUG nova.virt.libvirt.host [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.918 182729 DEBUG nova.virt.libvirt.host [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.918 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.919 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.919 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.919 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.919 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.920 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.921 182729 DEBUG nova.virt.hardware [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.924 182729 DEBUG nova.virt.libvirt.vif [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772595314',display_name='tempest-ServerDiskConfigTestJSON-server-772595314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772595314',id=92,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-cdf405e3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:48Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=63595c63-b40f-4491-be23-cc90675eb94e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.924 182729 DEBUG nova.network.os_vif_util [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.925 182729 DEBUG nova.network.os_vif_util [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.926 182729 DEBUG nova.objects.instance [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63595c63-b40f-4491-be23-cc90675eb94e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.944 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <uuid>63595c63-b40f-4491-be23-cc90675eb94e</uuid>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <name>instance-0000005c</name>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-772595314</nova:name>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:31:51</nova:creationTime>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         <nova:port uuid="108a1530-5cf0-483e-8cf2-e5ea0abc867d">
Jan 22 22:31:51 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <system>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="serial">63595c63-b40f-4491-be23-cc90675eb94e</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="uuid">63595c63-b40f-4491-be23-cc90675eb94e</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </system>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <os>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </os>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <features>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </features>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.config"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:9c:c8:53"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <target dev="tap108a1530-5c"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/console.log" append="off"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <video>
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </video>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:31:51 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:31:51 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:31:51 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:31:51 compute-0 nova_compute[182725]: </domain>
Jan 22 22:31:51 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.945 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Preparing to wait for external event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.945 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.945 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.945 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.946 182729 DEBUG nova.virt.libvirt.vif [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772595314',display_name='tempest-ServerDiskConfigTestJSON-server-772595314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772595314',id=92,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-cdf405e3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:48Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=63595c63-b40f-4491-be23-cc90675eb94e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.946 182729 DEBUG nova.network.os_vif_util [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.947 182729 DEBUG nova.network.os_vif_util [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.947 182729 DEBUG os_vif [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.947 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.948 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.948 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.950 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.950 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap108a1530-5c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.951 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap108a1530-5c, col_values=(('external_ids', {'iface-id': '108a1530-5cf0-483e-8cf2-e5ea0abc867d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:c8:53', 'vm-uuid': '63595c63-b40f-4491-be23-cc90675eb94e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:51 compute-0 NetworkManager[54954]: <info>  [1769121111.9534] manager: (tap108a1530-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:51 compute-0 nova_compute[182725]: 2026-01-22 22:31:51.962 182729 INFO os_vif [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c')
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.022 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.022 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.023 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:9c:c8:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.023 182729 INFO nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Using config drive
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.257 182729 INFO nova.virt.libvirt.driver [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance shutdown successfully after 3 seconds.
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.265 182729 INFO nova.virt.libvirt.driver [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance destroyed successfully.
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.266 182729 DEBUG nova.objects.instance [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'numa_topology' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.284 182729 DEBUG nova.compute.manager [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.399 182729 INFO nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Creating config drive at /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.config
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.404 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhal5clq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.448 182729 DEBUG oslo_concurrency.lockutils [None req-202d061b-1368-4b96-8ffc-98d546eea645 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.544 182729 DEBUG oslo_concurrency.processutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhal5clq" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.6196] manager: (tap108a1530-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Jan 22 22:31:52 compute-0 systemd-udevd[223093]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:52 compute-0 kernel: tap108a1530-5c: entered promiscuous mode
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.627 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:52 compute-0 ovn_controller[94850]: 2026-01-22T22:31:52Z|00294|binding|INFO|Claiming lport 108a1530-5cf0-483e-8cf2-e5ea0abc867d for this chassis.
Jan 22 22:31:52 compute-0 ovn_controller[94850]: 2026-01-22T22:31:52Z|00295|binding|INFO|108a1530-5cf0-483e-8cf2-e5ea0abc867d: Claiming fa:16:3e:9c:c8:53 10.100.0.12
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.6338] device (tap108a1530-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.6351] device (tap108a1530-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.635 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:c8:53 10.100.0.12'], port_security=['fa:16:3e:9c:c8:53 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '63595c63-b40f-4491-be23-cc90675eb94e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=108a1530-5cf0-483e-8cf2-e5ea0abc867d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.636 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 108a1530-5cf0-483e-8cf2-e5ea0abc867d in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.638 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:31:52 compute-0 ovn_controller[94850]: 2026-01-22T22:31:52Z|00296|binding|INFO|Setting lport 108a1530-5cf0-483e-8cf2-e5ea0abc867d ovn-installed in OVS
Jan 22 22:31:52 compute-0 ovn_controller[94850]: 2026-01-22T22:31:52Z|00297|binding|INFO|Setting lport 108a1530-5cf0-483e-8cf2-e5ea0abc867d up in Southbound
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.644 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.648 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.651 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d759fc76-18d1-4b80-a4b5-7cc80554585e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.653 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.655 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.655 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0459e0-87bb-467e-875e-a474b45aafbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.660 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8f8a29-a1d7-4f01-80dd-8002e3588acd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 systemd-machined[154006]: New machine qemu-37-instance-0000005c.
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.679 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f3ce3d-5256-4bff-a2f3-9a24c2d56f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-0000005c.
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.693 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[230f5fa5-30c6-4e51-8520-0ce15e6de127]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.724 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[52e9c7cb-faeb-4233-8157-c90c276454fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.729 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6db1e107-dd7e-48a8-984a-4b6c22bee118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.7309] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/151)
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.759 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[21f99a86-202b-47d3-a20d-a231a744f0f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.764 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[be8affdf-5c95-428b-9e7f-639144c30a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.7898] device (tap354683a7-30): carrier: link connected
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.796 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae3f804-dbed-47dd-8c01-cc23102b8faf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.817 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a26c6e-ceb9-444a-bd7c-b5f9ee4c2d2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475039, 'reachable_time': 38638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223234, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.836 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1529962e-f6ab-4920-9b64-6c47077ec134]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475039, 'tstamp': 475039}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223235, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.855 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0b363d-2724-4106-8eb0-428871e1c3f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475039, 'reachable_time': 38638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223236, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.898 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[028e945c-f9d7-4f38-8549-a9a9b64e4aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.991 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[21f8ac85-8553-4c0a-a8d4-922ac76a935d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.993 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.993 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:52 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:52.994 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:52 compute-0 nova_compute[182725]: 2026-01-22 22:31:52.997 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:52 compute-0 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 22:31:52 compute-0 NetworkManager[54954]: <info>  [1769121112.9976] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.004 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:53.006 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.007 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:53 compute-0 ovn_controller[94850]: 2026-01-22T22:31:53Z|00298|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.019 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.023 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:53.024 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:53.025 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0f32d68f-557b-4a3c-84e9-d8664b8b440d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:53.025 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:31:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:53.026 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.288 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121113.28808, 63595c63-b40f-4491-be23-cc90675eb94e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.288 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] VM Started (Lifecycle Event)
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.313 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.317 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121113.2882516, 63595c63-b40f-4491-be23-cc90675eb94e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.318 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] VM Paused (Lifecycle Event)
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.343 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.347 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.365 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:53 compute-0 podman[223284]: 2026-01-22 22:31:53.427084771 +0000 UTC m=+0.042264025 container create 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:53 compute-0 systemd[1]: Started libpod-conmon-98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454.scope.
Jan 22 22:31:53 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:31:53 compute-0 podman[223284]: 2026-01-22 22:31:53.403937918 +0000 UTC m=+0.019117182 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:31:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9c1b2b6d4c2124bfdf4955b9871bd481bb70a05562a3af3c6318d07da4fa223/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:31:53 compute-0 podman[223284]: 2026-01-22 22:31:53.517884327 +0000 UTC m=+0.133063621 container init 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:31:53 compute-0 podman[223284]: 2026-01-22 22:31:53.526237738 +0000 UTC m=+0.141417012 container start 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:31:53 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [NOTICE]   (223303) : New worker (223305) forked
Jan 22 22:31:53 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [NOTICE]   (223303) : Loading success.
Jan 22 22:31:53 compute-0 nova_compute[182725]: 2026-01-22 22:31:53.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:54 compute-0 ovn_controller[94850]: 2026-01-22T22:31:54Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.160 182729 DEBUG nova.compute.manager [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.160 182729 DEBUG oslo_concurrency.lockutils [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.161 182729 DEBUG oslo_concurrency.lockutils [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.161 182729 DEBUG oslo_concurrency.lockutils [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.162 182729 DEBUG nova.compute.manager [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.162 182729 WARNING nova.compute.manager [req-9fd5d552-17ff-44d8-8f94-031b36ac7db0 req-95a6e282-df18-4add-86ea-9e764d208336 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state stopped and task_state None.
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.233 182729 DEBUG nova.compute.manager [req-4c8941e9-01c6-40ae-8ea4-7c477498bdbe req-79b51c7a-c202-4969-b2d0-fe1aee0912e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.234 182729 DEBUG oslo_concurrency.lockutils [req-4c8941e9-01c6-40ae-8ea4-7c477498bdbe req-79b51c7a-c202-4969-b2d0-fe1aee0912e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.234 182729 DEBUG oslo_concurrency.lockutils [req-4c8941e9-01c6-40ae-8ea4-7c477498bdbe req-79b51c7a-c202-4969-b2d0-fe1aee0912e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.234 182729 DEBUG oslo_concurrency.lockutils [req-4c8941e9-01c6-40ae-8ea4-7c477498bdbe req-79b51c7a-c202-4969-b2d0-fe1aee0912e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.235 182729 DEBUG nova.compute.manager [req-4c8941e9-01c6-40ae-8ea4-7c477498bdbe req-79b51c7a-c202-4969-b2d0-fe1aee0912e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Processing event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.236 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.242 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121115.242591, 63595c63-b40f-4491-be23-cc90675eb94e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.243 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] VM Resumed (Lifecycle Event)
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.245 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.248 182729 INFO nova.virt.libvirt.driver [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Instance spawned successfully.
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.248 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.265 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.270 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.270 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.270 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.271 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.271 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.271 182729 DEBUG nova.virt.libvirt.driver [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.276 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.301 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.348 182729 INFO nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Took 6.64 seconds to spawn the instance on the hypervisor.
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.349 182729 DEBUG nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.471 182729 INFO nova.compute.manager [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Took 7.32 seconds to build instance.
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.499 182729 DEBUG oslo_concurrency.lockutils [None req-d6464b1f-60b0-485e-a6df-374fc8564afc b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.565 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'flavor' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.599 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'info_cache' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.622 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.623 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquired lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:31:55 compute-0 nova_compute[182725]: 2026-01-22 22:31:55.623 182729 DEBUG nova.network.neutron [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:31:56 compute-0 nova_compute[182725]: 2026-01-22 22:31:56.213 182729 DEBUG nova.network.neutron [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Updated VIF entry in instance network info cache for port 108a1530-5cf0-483e-8cf2-e5ea0abc867d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:31:56 compute-0 nova_compute[182725]: 2026-01-22 22:31:56.214 182729 DEBUG nova.network.neutron [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Updating instance_info_cache with network_info: [{"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:56 compute-0 nova_compute[182725]: 2026-01-22 22:31:56.238 182729 DEBUG oslo_concurrency.lockutils [req-fc2c409e-7755-4789-8568-b1b767f02a42 req-2058552e-c752-451c-a99e-34db5d69766f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-63595c63-b40f-4491-be23-cc90675eb94e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:56 compute-0 nova_compute[182725]: 2026-01-22 22:31:56.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.373 182729 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.374 182729 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.374 182729 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.374 182729 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.375 182729 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] No waiting events found dispatching network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:31:57 compute-0 nova_compute[182725]: 2026-01-22 22:31:57.375 182729 WARNING nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received unexpected event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d for instance with vm_state active and task_state None.
Jan 22 22:31:58 compute-0 podman[223314]: 2026-01-22 22:31:58.13065594 +0000 UTC m=+0.060445013 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 22:31:58 compute-0 podman[223315]: 2026-01-22 22:31:58.146692644 +0000 UTC m=+0.061144801 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.283 182729 DEBUG nova.network.neutron [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Updating instance_info_cache with network_info: [{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.303 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Releasing lock "refresh_cache-815ebbb8-e2c4-4f72-8048-df7c53f1439a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.330 182729 INFO nova.virt.libvirt.driver [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance destroyed successfully.
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.330 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'numa_topology' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.347 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'resources' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.372 182729 DEBUG nova.virt.libvirt.vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:52Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.373 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.374 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.374 182729 DEBUG os_vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.377 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e8cfdc3-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.379 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.381 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.388 182729 INFO os_vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60')
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.396 182729 DEBUG nova.virt.libvirt.driver [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start _get_guest_xml network_info=[{"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.400 182729 WARNING nova.virt.libvirt.driver [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.405 182729 DEBUG nova.virt.libvirt.host [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.406 182729 DEBUG nova.virt.libvirt.host [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.411 182729 DEBUG nova.virt.libvirt.host [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.411 182729 DEBUG nova.virt.libvirt.host [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.413 182729 DEBUG nova.virt.libvirt.driver [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.413 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.414 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.414 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.414 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.415 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.415 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.415 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.415 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.416 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.416 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.416 182729 DEBUG nova.virt.hardware [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.416 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.453 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:58.482 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:58.483 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.553 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.554 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.555 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.556 182729 DEBUG oslo_concurrency.lockutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.557 182729 DEBUG nova.virt.libvirt.vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:52Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.557 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.559 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.560 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'pci_devices' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.578 182729 DEBUG nova.virt.libvirt.driver [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <uuid>815ebbb8-e2c4-4f72-8048-df7c53f1439a</uuid>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <name>instance-00000058</name>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-871599483</nova:name>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:31:58</nova:creationTime>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:user uuid="b6f50d0e6a7444f0ac9c928363915afb">tempest-ListServerFiltersTestJSON-1169398826-project-member</nova:user>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:project uuid="802c49a328ca49e3a4ea4e46b9a9f5eb">tempest-ListServerFiltersTestJSON-1169398826</nova:project>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         <nova:port uuid="8e8cfdc3-60bc-4edf-89ba-c53573ea3141">
Jan 22 22:31:58 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <system>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="serial">815ebbb8-e2c4-4f72-8048-df7c53f1439a</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="uuid">815ebbb8-e2c4-4f72-8048-df7c53f1439a</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </system>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <os>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </os>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <features>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </features>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.config"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:b3:60:70"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <target dev="tap8e8cfdc3-60"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/console.log" append="off"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <video>
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </video>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:31:58 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:31:58 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:31:58 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:31:58 compute-0 nova_compute[182725]: </domain>
Jan 22 22:31:58 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.580 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.638 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.640 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.699 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.701 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.718 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.777 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.780 182729 DEBUG nova.virt.disk.api [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Checking if we can resize image /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.781 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.841 182729 DEBUG oslo_concurrency.processutils [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.842 182729 DEBUG nova.virt.disk.api [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Cannot resize image /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.843 182729 DEBUG nova.objects.instance [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'migration_context' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.865 182729 DEBUG nova.virt.libvirt.vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:52Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.866 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.869 182729 DEBUG nova.network.os_vif_util [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.870 182729 DEBUG os_vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.871 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.872 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.875 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.881 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.881 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e8cfdc3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.882 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8e8cfdc3-60, col_values=(('external_ids', {'iface-id': '8e8cfdc3-60bc-4edf-89ba-c53573ea3141', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:60:70', 'vm-uuid': '815ebbb8-e2c4-4f72-8048-df7c53f1439a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.884 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:58 compute-0 NetworkManager[54954]: <info>  [1769121118.8855] manager: (tap8e8cfdc3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.895 182729 INFO os_vif [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60')
Jan 22 22:31:58 compute-0 nova_compute[182725]: 2026-01-22 22:31:58.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.0129] manager: (tap8e8cfdc3-60): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 22 22:31:59 compute-0 kernel: tap8e8cfdc3-60: entered promiscuous mode
Jan 22 22:31:59 compute-0 ovn_controller[94850]: 2026-01-22T22:31:59Z|00299|binding|INFO|Claiming lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for this chassis.
Jan 22 22:31:59 compute-0 ovn_controller[94850]: 2026-01-22T22:31:59Z|00300|binding|INFO|8e8cfdc3-60bc-4edf-89ba-c53573ea3141: Claiming fa:16:3e:b3:60:70 10.100.0.14
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.017 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 ovn_controller[94850]: 2026-01-22T22:31:59Z|00301|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 ovn-installed in OVS
Jan 22 22:31:59 compute-0 ovn_controller[94850]: 2026-01-22T22:31:59Z|00302|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 up in Southbound
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.033 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:60:70 10.100.0.14'], port_security=['fa:16:3e:b3:60:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=8e8cfdc3-60bc-4edf-89ba-c53573ea3141) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.034 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 in datapath f234f62b-5371-4527-94e7-91cf5da3055e bound to our chassis
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.035 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.052 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8daffead-1c8d-4775-a124-d15cad65d429]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.052 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf234f62b-51 in ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.057 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf234f62b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.057 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4547b5a1-15da-4423-9c22-43836a69c9f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.058 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d467410b-d052-4663-843f-e8bdb3eb0d4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 systemd-udevd[223388]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.074 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[08fde7a7-0bba-4dc4-b15a-072386848412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 systemd-machined[154006]: New machine qemu-38-instance-00000058.
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.0935] device (tap8e8cfdc3-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.0949] device (tap8e8cfdc3-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:31:59 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-00000058.
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.094 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e55509de-530e-48c6-9673-dc5078492066]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.144 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1d416d80-1b45-4e9e-91d3-1232ab3e58e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.156 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0b77b695-fbb9-4af6-a51f-55879566d427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.1603] manager: (tapf234f62b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.192 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[5fed1c17-97db-4a90-a836-21146a380cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.199 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0023bfa2-59a8-4442-9508-66bdff32da6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.2345] device (tapf234f62b-50): carrier: link connected
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.244 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[54cf30c5-c13c-41fa-be9c-ab8049762a22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.271 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9d014dc8-a376-49be-ab22-c7bcb97472a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475683, 'reachable_time': 26374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223421, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.296 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[db01ec32-ab43-4792-a47b-3db46ff77900]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:3df6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475683, 'tstamp': 475683}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223422, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.324 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a50918-da45-46e2-9fd5-6c9bce7f5454]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475683, 'reachable_time': 26374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223423, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.364 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f49071-630b-47b5-b611-caba468ebede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.434 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4aef0828-cc81-4364-b5bf-8556c32ccb99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.435 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.436 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.436 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf234f62b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.438 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 kernel: tapf234f62b-50: entered promiscuous mode
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.442 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 NetworkManager[54954]: <info>  [1769121119.4434] manager: (tapf234f62b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.445 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf234f62b-50, col_values=(('external_ids', {'iface-id': '0a1fd4a8-b506-4c9d-9846-1c0ab542e465'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 ovn_controller[94850]: 2026-01-22T22:31:59Z|00303|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=0)
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.449 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.450 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42505bd8-6db9-4af9-bc2c-62c4cbfcd719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.451 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:31:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:31:59.452 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'env', 'PROCESS_TAG=haproxy-f234f62b-5371-4527-94e7-91cf5da3055e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f234f62b-5371-4527-94e7-91cf5da3055e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.457 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.545 182729 DEBUG nova.virt.libvirt.driver [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.753 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 815ebbb8-e2c4-4f72-8048-df7c53f1439a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.754 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121119.7524529, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.755 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Resumed (Lifecycle Event)
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.757 182729 DEBUG nova.compute.manager [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.762 182729 INFO nova.virt.libvirt.driver [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance rebooted successfully.
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.762 182729 DEBUG nova.compute.manager [None req-60dd2b65-a752-410e-a544-d94843500d9f b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.772 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.776 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.819 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.820 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121119.7541904, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.820 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Started (Lifecycle Event)
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.842 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:31:59 compute-0 nova_compute[182725]: 2026-01-22 22:31:59.846 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:31:59 compute-0 podman[223460]: 2026-01-22 22:31:59.955231043 +0000 UTC m=+0.067178962 container create 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:32:00 compute-0 systemd[1]: Started libpod-conmon-4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7.scope.
Jan 22 22:32:00 compute-0 podman[223460]: 2026-01-22 22:31:59.920588171 +0000 UTC m=+0.032536150 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.039 182729 DEBUG nova.compute.manager [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.040 182729 DEBUG oslo_concurrency.lockutils [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.040 182729 DEBUG oslo_concurrency.lockutils [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:00 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.041 182729 DEBUG oslo_concurrency.lockutils [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.041 182729 DEBUG nova.compute.manager [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:00 compute-0 nova_compute[182725]: 2026-01-22 22:32:00.042 182729 WARNING nova.compute.manager [req-7e8c0d4e-e92c-4ae1-b100-ce5e50016bbe req-e2692479-eabd-47b7-9faa-ceca778a8d4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state active and task_state None.
Jan 22 22:32:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89ae5f6b14701300d3a0edbf16ef68e9f059c1e8207c1da18d9e251b811e78af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:32:00 compute-0 podman[223460]: 2026-01-22 22:32:00.083983415 +0000 UTC m=+0.195931374 container init 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:32:00 compute-0 podman[223460]: 2026-01-22 22:32:00.095661939 +0000 UTC m=+0.207609868 container start 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 22:32:00 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [NOTICE]   (223479) : New worker (223481) forked
Jan 22 22:32:00 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [NOTICE]   (223479) : Loading success.
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.486 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:01 compute-0 kernel: tapb56a4401-4c (unregistering): left promiscuous mode
Jan 22 22:32:01 compute-0 NetworkManager[54954]: <info>  [1769121121.7292] device (tapb56a4401-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:32:01 compute-0 ovn_controller[94850]: 2026-01-22T22:32:01Z|00304|binding|INFO|Releasing lport b56a4401-4c89-482a-a347-ca080a879f8f from this chassis (sb_readonly=0)
Jan 22 22:32:01 compute-0 nova_compute[182725]: 2026-01-22 22:32:01.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:01 compute-0 ovn_controller[94850]: 2026-01-22T22:32:01Z|00305|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f down in Southbound
Jan 22 22:32:01 compute-0 ovn_controller[94850]: 2026-01-22T22:32:01Z|00306|binding|INFO|Removing iface tapb56a4401-4c ovn-installed in OVS
Jan 22 22:32:01 compute-0 nova_compute[182725]: 2026-01-22 22:32:01.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:01 compute-0 nova_compute[182725]: 2026-01-22 22:32:01.749 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.750 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.754 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.757 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.760 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9c2cd7-42c2-4032-8a82-d0d30f614e9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:01.761 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:32:01 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 22 22:32:01 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Consumed 13.727s CPU time.
Jan 22 22:32:01 compute-0 systemd-machined[154006]: Machine qemu-36-instance-00000057 terminated.
Jan 22 22:32:01 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [NOTICE]   (222992) : haproxy version is 2.8.14-c23fe91
Jan 22 22:32:01 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [NOTICE]   (222992) : path to executable is /usr/sbin/haproxy
Jan 22 22:32:01 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [WARNING]  (222992) : Exiting Master process...
Jan 22 22:32:01 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [ALERT]    (222992) : Current worker (222994) exited with code 143 (Terminated)
Jan 22 22:32:01 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[222988]: [WARNING]  (222992) : All workers exited. Exiting... (0)
Jan 22 22:32:01 compute-0 systemd[1]: libpod-ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079.scope: Deactivated successfully.
Jan 22 22:32:01 compute-0 podman[223509]: 2026-01-22 22:32:01.935370094 +0000 UTC m=+0.055459347 container died ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:32:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079-userdata-shm.mount: Deactivated successfully.
Jan 22 22:32:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-9a1b4bb1be6b7574cf5f9bf1d4a6b3ceb9d4591a90287c9786d99a030e11a628-merged.mount: Deactivated successfully.
Jan 22 22:32:01 compute-0 podman[223509]: 2026-01-22 22:32:01.971799312 +0000 UTC m=+0.091888565 container cleanup ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:32:01 compute-0 systemd[1]: libpod-conmon-ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079.scope: Deactivated successfully.
Jan 22 22:32:02 compute-0 podman[223545]: 2026-01-22 22:32:02.068016115 +0000 UTC m=+0.057404767 container remove ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.077 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2b06ff-dce0-4306-8b9d-8b84156d987d]: (4, ('Thu Jan 22 10:32:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079)\necd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079\nThu Jan 22 10:32:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ecd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079)\necd732c3c5303b79ef4f79219a5eaa077f063022e1f5698df165d53f43551079\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.079 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf8d874-edb3-47ed-b7ab-e0474811d385]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.080 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:02 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.099 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.107 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f637bc00-ca2a-45fb-99c3-7daddbc743a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.122 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9cb1c8-bd20-4c12-8d6b-f8c9d72f6821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.124 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aa41a87d-a607-4768-b173-4d5621aa7dcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.144 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa56eec-023f-49fd-8d5b-5444e7e5c372]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473859, 'reachable_time': 37447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223569, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.147 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.147 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdfceda-0372-41a5-9b70-a73fd819548d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:32:02 compute-0 podman[223570]: 2026-01-22 22:32:02.228748192 +0000 UTC m=+0.061474299 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.562 182729 INFO nova.virt.libvirt.driver [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance shutdown successfully after 13 seconds.
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.570 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance destroyed successfully.
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.571 182729 DEBUG nova.objects.instance [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'numa_topology' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.591 182729 DEBUG nova.compute.manager [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.677 182729 DEBUG nova.compute.manager [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.678 182729 DEBUG oslo_concurrency.lockutils [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.678 182729 DEBUG oslo_concurrency.lockutils [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.678 182729 DEBUG oslo_concurrency.lockutils [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.679 182729 DEBUG nova.compute.manager [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.679 182729 WARNING nova.compute.manager [req-f9d37660-bcfd-434e-af42-185b85c67357 req-4b5281b0-6b9e-441c-a3bc-9e01964476fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state active and task_state None.
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.692 182729 DEBUG oslo_concurrency.lockutils [None req-1d0932a5-f10a-427e-a0bc-459e3970f2d5 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.714 182729 DEBUG nova.compute.manager [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.715 182729 DEBUG oslo_concurrency.lockutils [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.716 182729 DEBUG oslo_concurrency.lockutils [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.719 182729 DEBUG oslo_concurrency.lockutils [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.719 182729 DEBUG nova.compute.manager [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.719 182729 WARNING nova.compute.manager [req-48641b44-7a82-434b-85b6-949792e29606 req-f4db5554-d7e8-4666-b61f-e8d46dfe4a16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state stopped and task_state None.
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.815 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.816 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.817 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.817 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.818 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.838 182729 INFO nova.compute.manager [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Terminating instance
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.854 182729 DEBUG nova.compute.manager [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:32:02 compute-0 kernel: tap108a1530-5c (unregistering): left promiscuous mode
Jan 22 22:32:02 compute-0 NetworkManager[54954]: <info>  [1769121122.8884] device (tap108a1530-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.898 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:02 compute-0 ovn_controller[94850]: 2026-01-22T22:32:02Z|00307|binding|INFO|Releasing lport 108a1530-5cf0-483e-8cf2-e5ea0abc867d from this chassis (sb_readonly=0)
Jan 22 22:32:02 compute-0 ovn_controller[94850]: 2026-01-22T22:32:02Z|00308|binding|INFO|Setting lport 108a1530-5cf0-483e-8cf2-e5ea0abc867d down in Southbound
Jan 22 22:32:02 compute-0 ovn_controller[94850]: 2026-01-22T22:32:02Z|00309|binding|INFO|Removing iface tap108a1530-5c ovn-installed in OVS
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.905 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.913 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:c8:53 10.100.0.12'], port_security=['fa:16:3e:9c:c8:53 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '63595c63-b40f-4491-be23-cc90675eb94e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=108a1530-5cf0-483e-8cf2-e5ea0abc867d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.915 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 108a1530-5cf0-483e-8cf2-e5ea0abc867d in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.917 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.918 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[36acb8a7-01d9-420e-8971-d4090bc0cdeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:02.919 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore
Jan 22 22:32:02 compute-0 nova_compute[182725]: 2026-01-22 22:32:02.930 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:02 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 22 22:32:02 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005c.scope: Consumed 8.363s CPU time.
Jan 22 22:32:02 compute-0 systemd-machined[154006]: Machine qemu-37-instance-0000005c terminated.
Jan 22 22:32:03 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [NOTICE]   (223303) : haproxy version is 2.8.14-c23fe91
Jan 22 22:32:03 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [NOTICE]   (223303) : path to executable is /usr/sbin/haproxy
Jan 22 22:32:03 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [WARNING]  (223303) : Exiting Master process...
Jan 22 22:32:03 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [ALERT]    (223303) : Current worker (223305) exited with code 143 (Terminated)
Jan 22 22:32:03 compute-0 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223299]: [WARNING]  (223303) : All workers exited. Exiting... (0)
Jan 22 22:32:03 compute-0 systemd[1]: libpod-98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454.scope: Deactivated successfully.
Jan 22 22:32:03 compute-0 podman[223620]: 2026-01-22 22:32:03.070471706 +0000 UTC m=+0.048273936 container died 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:32:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454-userdata-shm.mount: Deactivated successfully.
Jan 22 22:32:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-a9c1b2b6d4c2124bfdf4955b9871bd481bb70a05562a3af3c6318d07da4fa223-merged.mount: Deactivated successfully.
Jan 22 22:32:03 compute-0 podman[223620]: 2026-01-22 22:32:03.114021913 +0000 UTC m=+0.091824143 container cleanup 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:32:03 compute-0 systemd[1]: libpod-conmon-98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454.scope: Deactivated successfully.
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.138 182729 INFO nova.virt.libvirt.driver [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Instance destroyed successfully.
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.140 182729 DEBUG nova.objects.instance [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 63595c63-b40f-4491-be23-cc90675eb94e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.156 182729 DEBUG nova.virt.libvirt.vif [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772595314',display_name='tempest-ServerDiskConfigTestJSON-server-772595314',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772595314',id=92,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-cdf405e3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:00Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=63595c63-b40f-4491-be23-cc90675eb94e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.156 182729 DEBUG nova.network.os_vif_util [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "address": "fa:16:3e:9c:c8:53", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap108a1530-5c", "ovs_interfaceid": "108a1530-5cf0-483e-8cf2-e5ea0abc867d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.157 182729 DEBUG nova.network.os_vif_util [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.157 182729 DEBUG os_vif [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.159 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.159 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap108a1530-5c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.168 182729 INFO os_vif [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:c8:53,bridge_name='br-int',has_traffic_filtering=True,id=108a1530-5cf0-483e-8cf2-e5ea0abc867d,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap108a1530-5c')
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.169 182729 INFO nova.virt.libvirt.driver [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Deleting instance files /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e_del
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.169 182729 INFO nova.virt.libvirt.driver [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Deletion of /var/lib/nova/instances/63595c63-b40f-4491-be23-cc90675eb94e_del complete
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.271 182729 INFO nova.compute.manager [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.272 182729 DEBUG oslo.service.loopingcall [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.272 182729 DEBUG nova.compute.manager [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.273 182729 DEBUG nova.network.neutron [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:32:03 compute-0 podman[223666]: 2026-01-22 22:32:03.382119534 +0000 UTC m=+0.234960848 container remove 98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.396 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94ba0c95-307a-4190-a479-32d2c1927cfd]: (4, ('Thu Jan 22 10:32:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454)\n98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454\nThu Jan 22 10:32:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454)\n98c19f3ba2ad0cf35e32353014e7a638b9ec400e1347d9f36883468a91134454\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.398 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5108352d-9712-41a0-92e6-5b0857ebb7f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.399 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.401 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:03 compute-0 kernel: tap354683a7-30: left promiscuous mode
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.416 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.419 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4882b27b-b16b-41f5-98eb-649fd69d5c36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.443 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2cab426a-dad1-4b02-97bb-092c12d7ded3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.445 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f941b15f-688a-40d7-94ab-49a201ae5887]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.465 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f02b4933-0d33-480a-bdf3-3bc871dcd83a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475032, 'reachable_time': 19157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223689, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.471 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:32:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:03.472 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[417ae8cb-8407-4120-9c31-06093e58fb74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.685 182729 DEBUG nova.compute.manager [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-unplugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.687 182729 DEBUG oslo_concurrency.lockutils [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.687 182729 DEBUG oslo_concurrency.lockutils [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.688 182729 DEBUG oslo_concurrency.lockutils [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.688 182729 DEBUG nova.compute.manager [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] No waiting events found dispatching network-vif-unplugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.689 182729 DEBUG nova.compute.manager [req-d37df9d8-d2ef-44a6-97bc-3fddf1295487 req-9ec942ba-89ce-4871-a829-f30c1a55a7f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-unplugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:32:03 compute-0 nova_compute[182725]: 2026-01-22 22:32:03.944 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:04 compute-0 nova_compute[182725]: 2026-01-22 22:32:04.974 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.002 182729 DEBUG nova.compute.manager [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.002 182729 DEBUG oslo_concurrency.lockutils [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.002 182729 DEBUG oslo_concurrency.lockutils [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.003 182729 DEBUG oslo_concurrency.lockutils [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.003 182729 DEBUG nova.compute.manager [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.003 182729 WARNING nova.compute.manager [req-77c7e33d-709e-4447-818e-368116baf0dd req-7ca80d39-8490-4d61-bb01-08e39cf5e678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state stopped and task_state powering-on.
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.004 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'info_cache' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.049 182729 DEBUG nova.network.neutron [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.058 182729 DEBUG oslo_concurrency.lockutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.059 182729 DEBUG oslo_concurrency.lockutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.059 182729 DEBUG nova.network.neutron [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.079 182729 INFO nova.compute.manager [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Took 1.81 seconds to deallocate network for instance.
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.170 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.171 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.280 182729 DEBUG nova.compute.provider_tree [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.308 182729 DEBUG nova.scheduler.client.report [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.333 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.381 182729 INFO nova.scheduler.client.report [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocations for instance 63595c63-b40f-4491-be23-cc90675eb94e
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.503 182729 DEBUG oslo_concurrency.lockutils [None req-c93dd8d9-e815-431f-b5f0-2a2e5d4e43b7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.769 182729 DEBUG nova.compute.manager [req-f95815da-7e12-4621-9736-b6dd530ffae6 req-07c25737-0bc4-41eb-a289-87ade0d0b02b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-deleted-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.817 182729 DEBUG nova.compute.manager [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.817 182729 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "63595c63-b40f-4491-be23-cc90675eb94e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.818 182729 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.818 182729 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "63595c63-b40f-4491-be23-cc90675eb94e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.818 182729 DEBUG nova.compute.manager [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] No waiting events found dispatching network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:05 compute-0 nova_compute[182725]: 2026-01-22 22:32:05.818 182729 WARNING nova.compute.manager [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Received unexpected event network-vif-plugged-108a1530-5cf0-483e-8cf2-e5ea0abc867d for instance with vm_state deleted and task_state None.
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.867 182729 DEBUG nova.network.neutron [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.889 182729 DEBUG oslo_concurrency.lockutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.927 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance destroyed successfully.
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.928 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'numa_topology' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.942 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.955 182729 DEBUG nova.virt.libvirt.vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.956 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.957 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.957 182729 DEBUG os_vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.960 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb56a4401-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.962 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.964 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.967 182729 INFO os_vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.976 182729 DEBUG nova.virt.libvirt.driver [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start _get_guest_xml network_info=[{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.980 182729 WARNING nova.virt.libvirt.driver [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.986 182729 DEBUG nova.virt.libvirt.host [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.988 182729 DEBUG nova.virt.libvirt.host [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.991 182729 DEBUG nova.virt.libvirt.host [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.991 182729 DEBUG nova.virt.libvirt.host [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.993 182729 DEBUG nova.virt.libvirt.driver [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.993 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.994 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.994 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.994 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.994 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.995 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.995 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.995 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.996 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.996 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.996 182729 DEBUG nova.virt.hardware [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:32:06 compute-0 nova_compute[182725]: 2026-01-22 22:32:06.997 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.015 182729 DEBUG nova.virt.libvirt.vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.016 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.017 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.018 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.031 182729 DEBUG nova.virt.libvirt.driver [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <uuid>454ec87b-a45c-40af-8bce-d252eea19620</uuid>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <name>instance-00000057</name>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-256562799</nova:name>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:32:06</nova:creationTime>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         <nova:port uuid="b56a4401-4c89-482a-a347-ca080a879f8f">
Jan 22 22:32:07 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <system>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="serial">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="uuid">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </system>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <os>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </os>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <features>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </features>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:f0:b5:89"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <target dev="tapb56a4401-4c"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/console.log" append="off"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <video>
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </video>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:32:07 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:32:07 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:32:07 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:32:07 compute-0 nova_compute[182725]: </domain>
Jan 22 22:32:07 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.033 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.099 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.101 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.174 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.177 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'trusted_certs' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.231 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.330 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.332 182729 DEBUG nova.virt.disk.api [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.332 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.428 182729 DEBUG oslo_concurrency.processutils [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.429 182729 DEBUG nova.virt.disk.api [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.430 182729 DEBUG nova.objects.instance [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.444 182729 DEBUG nova.virt.libvirt.vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.445 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.446 182729 DEBUG nova.network.os_vif_util [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.447 182729 DEBUG os_vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.448 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.449 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.450 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.453 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.453 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb56a4401-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.454 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb56a4401-4c, col_values=(('external_ids', {'iface-id': 'b56a4401-4c89-482a-a347-ca080a879f8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:b5:89', 'vm-uuid': '454ec87b-a45c-40af-8bce-d252eea19620'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.456 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.4570] manager: (tapb56a4401-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.457 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.465 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.467 182729 INFO os_vif [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:32:07 compute-0 kernel: tapb56a4401-4c: entered promiscuous mode
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.5730] manager: (tapb56a4401-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 22 22:32:07 compute-0 ovn_controller[94850]: 2026-01-22T22:32:07Z|00310|binding|INFO|Claiming lport b56a4401-4c89-482a-a347-ca080a879f8f for this chassis.
Jan 22 22:32:07 compute-0 ovn_controller[94850]: 2026-01-22T22:32:07Z|00311|binding|INFO|b56a4401-4c89-482a-a347-ca080a879f8f: Claiming fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 ovn_controller[94850]: 2026-01-22T22:32:07Z|00312|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f ovn-installed in OVS
Jan 22 22:32:07 compute-0 ovn_controller[94850]: 2026-01-22T22:32:07Z|00313|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f up in Southbound
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.586 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.588 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.588 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.590 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.591 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.604 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2a143e9b-ec49-4891-b55a-402505099014]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.605 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.608 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.608 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[91d9d0a0-9294-4198-a9e5-f65cb4f14466]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.609 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d36542ee-c723-4354-9ce7-ddda5ca2a879]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.622 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c1ab24-4be1-4330-9e4c-9406049537da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 systemd-udevd[223721]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:32:07 compute-0 systemd-machined[154006]: New machine qemu-39-instance-00000057.
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.639 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e8054b36-4e42-420f-ac6e-99e8c1c14a7d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.6466] device (tapb56a4401-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.6473] device (tapb56a4401-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:32:07 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-00000057.
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.673 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd676f3-018f-4c1a-940c-b33d16e7c230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.681 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d83fd1a4-b8c4-430f-95c9-98f6d89670be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.6842] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Jan 22 22:32:07 compute-0 systemd-udevd[223724]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.718 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f207dad7-13b9-4023-8565-8030e0f0ad10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.721 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[df1d5dc6-a144-4ae7-9c01-0c0fc93bb91f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.7501] device (tape65877e5-00): carrier: link connected
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.757 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6f005daf-7cad-4096-af73-ffd8d3e43e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.774 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1bde9915-7bb4-44fe-8063-a497be675a5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476535, 'reachable_time': 43636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223752, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.792 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19fe5b7b-0e53-4c87-b211-30f7b76b5499]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476535, 'tstamp': 476535}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223753, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.809 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c6159415-e5a2-41e0-b3ef-84f378c22841]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476535, 'reachable_time': 43636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223754, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.853 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5ad76c-45b7-4c03-a583-84eadb6571e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.882 182729 DEBUG nova.compute.manager [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.883 182729 DEBUG oslo_concurrency.lockutils [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.883 182729 DEBUG oslo_concurrency.lockutils [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.883 182729 DEBUG oslo_concurrency.lockutils [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.884 182729 DEBUG nova.compute.manager [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.884 182729 WARNING nova.compute.manager [req-26483a69-fabf-4c03-8df2-d1ffd481fe21 req-ea7b2ec2-e8b1-4d63-8fe0-9b7553038804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state stopped and task_state powering-on.
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.909 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 454ec87b-a45c-40af-8bce-d252eea19620 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.910 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121127.9092324, 454ec87b-a45c-40af-8bce-d252eea19620 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.910 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Resumed (Lifecycle Event)
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.914 182729 DEBUG nova.compute.manager [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.920 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance rebooted successfully.
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.920 182729 DEBUG nova.compute.manager [None req-4da44549-d901-4fc5-babc-95ef43a39ca3 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.933 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[678d63a0-3175-4aec-9412-3081dd7e63b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.935 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.935 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.935 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.936 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 NetworkManager[54954]: <info>  [1769121127.9387] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Jan 22 22:32:07 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.943 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.943 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 ovn_controller[94850]: 2026-01-22T22:32:07Z|00314|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.947 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.947 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.948 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1a998d7b-336d-4d83-8e0a-ad21fcc078bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.949 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:32:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:07.950 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.957 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.976 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.977 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121127.9102352, 454ec87b-a45c-40af-8bce-d252eea19620 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:07 compute-0 nova_compute[182725]: 2026-01-22 22:32:07.977 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Started (Lifecycle Event)
Jan 22 22:32:08 compute-0 nova_compute[182725]: 2026-01-22 22:32:08.001 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:08 compute-0 nova_compute[182725]: 2026-01-22 22:32:08.010 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:08 compute-0 podman[223793]: 2026-01-22 22:32:08.405035894 +0000 UTC m=+0.061383937 container create 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:32:08 compute-0 systemd[1]: Started libpod-conmon-86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692.scope.
Jan 22 22:32:08 compute-0 podman[223793]: 2026-01-22 22:32:08.373232173 +0000 UTC m=+0.029580236 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:32:08 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:32:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a2b7455e8b4553992d821de71eefd436ef75c65193fbc253a68f11ecc65be6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:32:08 compute-0 podman[223793]: 2026-01-22 22:32:08.512945141 +0000 UTC m=+0.169293264 container init 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:32:08 compute-0 podman[223793]: 2026-01-22 22:32:08.518611304 +0000 UTC m=+0.174959377 container start 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:32:08 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [NOTICE]   (223813) : New worker (223815) forked
Jan 22 22:32:08 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [NOTICE]   (223813) : Loading success.
Jan 22 22:32:08 compute-0 nova_compute[182725]: 2026-01-22 22:32:08.947 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.111 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000058', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'hostId': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.115 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '454ec87b-a45c-40af-8bce-d252eea19620', 'name': 'tempest-ServerActionsTestJSON-server-256562799', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000057', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '301c97a097c64afd8d55adb73fdd8cce', 'user_id': '97ae504d8c4f43529c360266766791d0', 'hostId': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.144 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.145 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.172 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.173 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91fe67d2-1a66-4ceb-99d5-f13247ac26ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.115525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '302db9e4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '8d9f3fa484638b6bead0efef802fe46afcb88eda585bab6d94df59b0a54d0b19'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.115525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '302dc664-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '8742dc0c4daa1467c7b8eefd2bbc946503b3e15adc60eb3848aa15caa59d9331'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.115525', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3031da10-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': 'a09646deadbfad12de46d7a5ac9db5250afa64809cbd6c2cf37f4334db803970'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.115525', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3031f324-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '9c49eef0191ebc07d7e754e117294f3a244f5ee842650acebdc62cdc505e51b8'}]}, 'timestamp': '2026-01-22 22:32:09.173732', '_unique_id': '29c5853f62f349e38a6c1b056c47d8bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.175 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.182 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 815ebbb8-e2c4-4f72-8048-df7c53f1439a / tap8e8cfdc3-60 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.182 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.187 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 454ec87b-a45c-40af-8bce-d252eea19620 / tapb56a4401-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.187 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94b56d8e-88d6-4482-ba41-a185908e0359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.177909', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '3033710e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '995ee1308d88d2eb7096bc8bea879781e65f5866cd9ba310ec6b2ac8a78184c5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.177909', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '30342d7e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '9ff4a8e96d626c93872afd75f3d068b995ab77a0d278d48226ef6aefa24f52af'}]}, 'timestamp': '2026-01-22 22:32:09.188500', '_unique_id': '63a117f45d2a4a33ad508c643fdd7843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.190 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.192 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.192 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09546e61-a623-41ce-899e-53a55c117b2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.192185', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '3034d922-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': 'f177ae3e1cdb7c3de718ee213a4759ba819e91242a1ba9036cdd32fb6415f428'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.192185', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '3034f90c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '25c5dac738399e7479f38d4b7a17347514019d8658c0cabdf96b161cbc564c28'}]}, 'timestamp': '2026-01-22 22:32:09.193636', '_unique_id': '99875dae8c01443a96e02c4007a23588'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.195 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.197 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.197 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3612f55-95f6-4063-a1e5-8a46993c7e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.197113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '30359b0a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '8dc88625c99e53b3da38f3637fc592e4221c9ba22744f1535080403957134b0e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.197113', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '3035b4fa-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '3aaabe0406edc8ef235052d510026d70c107be41b7e91007f01b1bb805fd7f56'}]}, 'timestamp': '2026-01-22 22:32:09.198368', '_unique_id': '6c383deb71124144a6a3e9bd4ec7f219'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.199 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.201 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.202 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>]
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.202 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.latency volume: 180305679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.203 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.latency volume: 457062 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.204 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.latency volume: 139745709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.204 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.latency volume: 294777 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fda5217b-04aa-46f0-a54b-b0a05272f394', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 180305679, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.202936', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '30367cdc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': 'b53dc6ba31c6eeb5d7cfc6ffc0c07b11c6bd782e44444c4529e0de6a7f067dca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 457062, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': 
None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.202936', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '303694c4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '154c0b522a34bc66f444d8a19474f1b579593b3eb258f85c133c4cfd4d38bc85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 139745709, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.202936', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3036a8ec-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': 'c3adb237f3aa51182623836e8ce9fbbe43c823c0c7aa70484ede6ecb1e7c26d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 294777, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.202936', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3036c0de-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': 'e9afe1625bc612a7ac40c796360247095f341cc32d5f8f4f9094674281c9bf94'}]}, 'timestamp': '2026-01-22 22:32:09.205189', '_unique_id': '7372406702594cef859300309b04a1a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.206 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.208 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.208 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>]
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.208 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.208 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>]
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.221 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.222 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.245 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.246 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9685d1-1fc4-4315-a96e-df6b199a51e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.209431', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '30396ed8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': '2f2755bc0fe8070ad3fdeaee8ed39e7c1b9d979e58bc0e6b8d833e7666ce1885'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 
'815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.209431', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '30397f04-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': '49e1f02512833008f9c27bc1f6f13f8aa561f28b8b07230a3ef4cab164db8aaf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.209431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '303d0c32-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': 'ae53f04f9dc58b3757c803bd0b749dd61a5225a22eeea064aca508080999a23c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.209431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '303d181c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': '99c051cdb6cb48ddd6778608310531c9f27cd316a408b772bfd6e5cf132c8a78'}]}, 'timestamp': '2026-01-22 22:32:09.246639', '_unique_id': 'c661ace6d69c4bd398936f21a9dcb758'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.247 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.250 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.267 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/cpu volume: 8990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.282 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/cpu volume: 1270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa9b505-ee97-4829-9118-c475f2bf9a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8990000000, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'timestamp': '2026-01-22T22:32:09.250676', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '304058d8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.924801472, 'message_signature': '8037106905ba8aa04d1610960d2f2053252fe409677d72b54fa520b497f8d9ee'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1270000000, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 
'454ec87b-a45c-40af-8bce-d252eea19620', 'timestamp': '2026-01-22T22:32:09.250676', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '30428ab8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.939206985, 'message_signature': '65d04b820bdacd986be048334b8138c7ccf3f894f8450618a83c53629eef6039'}]}, 'timestamp': '2026-01-22 22:32:09.282330', '_unique_id': '22882bc82a814d4eb06fd7ccc9720f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.283 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.288 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.289 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9770535b-7d60-4996-904b-13859d0c3fef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.288590', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '30439192-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '7635926ccb632286f159bf728eb70d3ab4262435ef361ba001b0cac6e824e076'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.288590', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '3043a3e4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '016f9033f9167646fd842b3aa1a3f69bf6ce62ab9ee07ba6dc430a5df2fb73b0'}]}, 'timestamp': '2026-01-22 22:32:09.289638', '_unique_id': '413e520c396149c994941724368d8294'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.290 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.296 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.296 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.296 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 815ebbb8-e2c4-4f72-8048-df7c53f1439a: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.296 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.297 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 454ec87b-a45c-40af-8bce-d252eea19620: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.297 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.297 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.298 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.298 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.298 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96897964-9e39-46aa-8864-a291ec8066be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.297510', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3044e9ca-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': '69ffa16c1a37856aa5f89622447caefd1b66aa2bb7541149a50c51271a5aa483'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.297510', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3044fc94-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': '6b156be4912af94d41d2342644156281e268b4a7ec2e4e698b6c8d63f9ee0a84'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.297510', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '30450c98-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': '53ba59a8c1935c9120a29878ccf4997d15583617af5e5a85ae45328420f2e752'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.297510', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '30451dd2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': '758927e7ab37557ebb7ac89ca9f1e121ae4f29053b4971d9796ec1ebd8ba56c6'}]}, 'timestamp': '2026-01-22 22:32:09.299291', '_unique_id': 'd6a5641e200d48089614600fff51b395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.301 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.305 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.305 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.306 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-871599483>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-256562799>]
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.306 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.307 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f12ac56-842e-4d63-8339-c128ce493251', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.306555', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '30465134-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '2e80550ce5635a411108dacddd4a724e1a8441b0af21814768b6457189aebb80'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.306555', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '30466200-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '162f054e4ef8e0e403c0c34e2e24f6633fabd2279c23c3f18dd8c4c09ea60523'}]}, 'timestamp': '2026-01-22 22:32:09.307550', '_unique_id': '835d447393ac4f42ab132f9fd868530d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.308 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.312 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.312 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '000fb18f-db07-44e0-8de7-18f747718c40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.311938', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '304722f8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': 'a668b4e03f36d319ccb176b001b9342f41a6feda044b2c0a13e3a6b129ebbc00'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.311938', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '30473626-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '159a5028908ff8898b0a36ae02e8d4bd2a9d817d98a6aa319be2e3d5a9776dd8'}]}, 'timestamp': '2026-01-22 22:32:09.312983', '_unique_id': 'ebd05639e16a4df5862109d3a055c8de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.313 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.317 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.317 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a20e1483-a8b9-4db0-ba5f-d2c86eebd30c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.317301', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '3047f494-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': 'b1f1a530d9e7606eddfd761bc048676d24d246199470e30cc62b5a5d0d5d0261'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.317301', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '30480650-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '41a14a6d97ce423b28995b7542978f99e17da396fce719b8ceb1c8d8090b3fb6'}]}, 'timestamp': '2026-01-22 22:32:09.318310', '_unique_id': '86fa087f55094104a8688d1b1856a8f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.318 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.322 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.323 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '334e9f9b-bc92-4422-88f8-10ec017779ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.322592', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '3048c3e2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '10042d92bdd649a2dd16c78e4abe93220b817545bee355725826693a325e1dfc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.322592', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '3048d49a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': '0b56bc2ffd623ef0f290347962ecce7c98acb60b4904190c8f3f8191d7d0b6de'}]}, 'timestamp': '2026-01-22 22:32:09.323588', '_unique_id': 'fb141ea3895441a18e06800888a44ddb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.324 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.328 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.328 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.328 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.329 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92c74977-18b2-4d9b-b7f5-b2fdf0f85120', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.328066', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '30499560-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '75a47bb18a5b4a6d0165fc43a81cede89969791fd754b0723b455c7dd3d27091'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': 
None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.328066', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3049a6d6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': 'e79230e3076789eb043b11d3bb0a590e55d4160dcf13a888c773653f1426508f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.328066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3049b87e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '5b3581ac36041241772fe8ff24aeeef722855997201b88d39e430c128cfec768'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.328066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3049c800-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': 'ba417240de69062874ae213c9e07dc612ab77ddccb841b77e2aecebf9ccefa10'}]}, 'timestamp': '2026-01-22 22:32:09.329879', '_unique_id': 'b85069bf95b649eeb404ff8b156493e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.330 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.334 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.335 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.335 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.336 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8740065a-0898-48a0-9be8-fb07cb96519a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.334578', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304a992e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '3162e7a3e438350318602147bd80f71b41ecbfb06a183f1aba6580d2e4fc868c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.334578', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304aaa90-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '7438a354b602c6e3a1c7726d4b2f050a287675c9f36f108e61542812d2678a7e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.334578', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304abac6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '43182d50b071b57470a2a5fde83164decad555828e47789fd0287dbfc5e5993f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.334578', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304acb56-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '2d5c959af26c9f0dfe13a3e09e11c30a2b027a4bb8ab9d20ef19b371bf253119'}]}, 'timestamp': '2026-01-22 22:32:09.336452', '_unique_id': '9d932c0ad6da4d66a80cdc0225c8ec50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.339 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.340 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19210602-1a0a-4885-8530-d79970af6ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.339500', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '304b64a8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': '9da64f27d89e305e9d159511fb266d1465a0654653ad5456060dde1cb785e01b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.339500', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '304b7ef2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': 'b6a8133abb1b8e658cc76d6177208f1831cf05e0dd45da124832851cacc02e89'}]}, 'timestamp': '2026-01-22 22:32:09.341149', '_unique_id': 'f62eed4a823a455cb23daa2abe4905ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.344 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.344 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59dd0ee2-429e-4965-942f-364173c02119', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-00000058-815ebbb8-e2c4-4f72-8048-df7c53f1439a-tap8e8cfdc3-60', 'timestamp': '2026-01-22T22:32:09.344420', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'tap8e8cfdc3-60', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:60:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e8cfdc3-60'}, 'message_id': '304c13a8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.835240137, 'message_signature': 'e82534a08ba24c63f36eeed9a2194e6a2ad1b2d05e6b10b7497b9d0f89fea6a1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-00000057-454ec87b-a45c-40af-8bce-d252eea19620-tapb56a4401-4c', 'timestamp': '2026-01-22T22:32:09.344420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'tapb56a4401-4c', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f0:b5:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb56a4401-4c'}, 'message_id': '304c274e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.840869619, 'message_signature': 'e88665d16f0d1383e422466d63cf7f153baf3e8d1db21910d9057f6e345e6f47'}]}, 'timestamp': '2026-01-22 22:32:09.345433', '_unique_id': '67ad085ee1cc493e8d58bde506c9378e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.347 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.348 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.348 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.349 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba7d2a4e-8732-42fb-902f-c89ebd54f647', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.347880', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304c99a4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': 'a2c713f1a5d5ebeb357fb852835310364fbd75231795c9499773bf420b1092c3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 
'815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.347880', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304caa52-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.86675039, 'message_signature': '45729c7ed1f6b32a49beb776dda37eaf88e48e4c5e365585769492de61262589'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.347880', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304cbcb8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': '7859334ef0c85a524d69205791b33cc23f8df63658bcb7ff4191446638bb149a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.347880', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304ccc1c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.880335723, 'message_signature': 'f2f0bb032ad2182f9c4e71651e372b2f88d3a3981081d459ab0ef350cdc7c94e'}]}, 'timestamp': '2026-01-22 22:32:09.349627', '_unique_id': '5e019f76e14643f7adf3c33583a61e3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.350 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.352 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.352 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.353 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.353 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83bf5d58-4405-44f7-b239-99c3f1773870', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.352082', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304d3daa-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '238c188a765c4338d15fabc8ec43fc252a66468c675c7d177051b1ba935fde78'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': 
None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.352082', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304d5060-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '110765107868144260ac67df212bae0f2f56dc6b2e22c6362058e88fa711cc34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.352082', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304d6082-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': 'cdf9b7b86772689b82429d57fbe3e7b30fa4c6bf3afafeebc1ee8459ab51d8eb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.352082', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304d6fdc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '6ab4eeed3bf298a1c765adba88ad050156ce971f01bc376d6b090a911302e813'}]}, 'timestamp': '2026-01-22 22:32:09.353855', '_unique_id': '8e9cef0d389f4a4695eada57034a7c53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.356 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.356 12 DEBUG ceilometer.compute.pollsters [-] 815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.357 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.357 12 DEBUG ceilometer.compute.pollsters [-] 454ec87b-a45c-40af-8bce-d252eea19620/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2fe52a8-9921-427d-9269-5627e863038a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-vda', 'timestamp': '2026-01-22T22:32:09.356257', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304de034-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '8881ff59ca2a0b2fd1c1a754e2f40b2d289c4578c4b050dbaf9215724069b0ec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a-sda', 'timestamp': '2026-01-22T22:32:09.356257', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-871599483', 'name': 'instance-00000058', 'instance_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'instance_type': 'm1.nano', 'host': 'd2402351f9be77a1f8c384d9b21fdf9bad7d97dcd61777bf8945e74c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304df2cc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.772796615, 'message_signature': '80686a89be7d0ed20a82ee39415c748767b020e43e761511e53967a015532270'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-vda', 'timestamp': '2026-01-22T22:32:09.356257', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '304e028a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '4bd34bfd9978fb605c564b0e140f10f3e5c941e6361b61d2ae42bd1a6ec5ec7d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': '454ec87b-a45c-40af-8bce-d252eea19620-sda', 'timestamp': '2026-01-22T22:32:09.356257', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-256562799', 'name': 'instance-00000057', 'instance_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '304e1180-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 4766.803496618, 'message_signature': '7f3a276e9e7b0f458e048eff48322caf7bdad77e4a38fa1c2fb34569aeb79397'}]}, 'timestamp': '2026-01-22 22:32:09.359290', '_unique_id': 'd40b10ac30b24e4ca4dfd1df0604ad4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:32:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:32:09.360 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.052 182729 DEBUG nova.compute.manager [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.052 182729 DEBUG oslo_concurrency.lockutils [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.053 182729 DEBUG oslo_concurrency.lockutils [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.053 182729 DEBUG oslo_concurrency.lockutils [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.053 182729 DEBUG nova.compute.manager [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:10 compute-0 nova_compute[182725]: 2026-01-22 22:32:10.053 182729 WARNING nova.compute.manager [req-01ea99d8-5713-41f4-a108-5de6fbeeda6e req-6020f96b-31a8-4a53-b1f4-c628bee1c2b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:32:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:12.439 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:12.440 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:12.441 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:12 compute-0 nova_compute[182725]: 2026-01-22 22:32:12.457 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:12 compute-0 ovn_controller[94850]: 2026-01-22T22:32:12Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:60:70 10.100.0.14
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.069 182729 INFO nova.compute.manager [None req-1316b3f9-2be1-436a-b743-ef0e54ccf8f2 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Pausing
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.070 182729 DEBUG nova.objects.instance [None req-1316b3f9-2be1-436a-b743-ef0e54ccf8f2 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:13 compute-0 ovn_controller[94850]: 2026-01-22T22:32:13Z|00315|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=0)
Jan 22 22:32:13 compute-0 ovn_controller[94850]: 2026-01-22T22:32:13Z|00316|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.130 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121133.129915, 454ec87b-a45c-40af-8bce-d252eea19620 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.130 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Paused (Lifecycle Event)
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.133 182729 DEBUG nova.compute.manager [None req-1316b3f9-2be1-436a-b743-ef0e54ccf8f2 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.149 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.185 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.189 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.212 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 22 22:32:13 compute-0 nova_compute[182725]: 2026-01-22 22:32:13.950 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:14 compute-0 nova_compute[182725]: 2026-01-22 22:32:14.912 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:16 compute-0 podman[223829]: 2026-01-22 22:32:16.164448329 +0000 UTC m=+0.092964402 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.757 182729 INFO nova.compute.manager [None req-0f5e62cd-de05-4b65-83c7-e3561c79f8e1 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Unpausing
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.758 182729 DEBUG nova.objects.instance [None req-0f5e62cd-de05-4b65-83c7-e3561c79f8e1 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.790 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121136.7901657, 454ec87b-a45c-40af-8bce-d252eea19620 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.790 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Resumed (Lifecycle Event)
Jan 22 22:32:16 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.794 182729 DEBUG nova.virt.libvirt.guest [None req-0f5e62cd-de05-4b65-83c7-e3561c79f8e1 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.794 182729 DEBUG nova.compute.manager [None req-0f5e62cd-de05-4b65-83c7-e3561c79f8e1 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.814 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.817 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.845 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.917 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:32:16 compute-0 nova_compute[182725]: 2026-01-22 22:32:16.993 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.059 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.060 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.126 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.132 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.195 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.196 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.254 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.418 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.420 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5322MB free_disk=73.30965042114258GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.420 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.420 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.461 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.494 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 454ec87b-a45c-40af-8bce-d252eea19620 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.495 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 815ebbb8-e2c4-4f72-8048-df7c53f1439a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.495 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.495 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.554 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.567 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.592 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.592 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.654 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.654 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.655 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.655 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.655 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.666 182729 INFO nova.compute.manager [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Terminating instance
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.675 182729 DEBUG nova.compute.manager [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:32:17 compute-0 kernel: tap8e8cfdc3-60 (unregistering): left promiscuous mode
Jan 22 22:32:17 compute-0 NetworkManager[54954]: <info>  [1769121137.7122] device (tap8e8cfdc3-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:32:17 compute-0 ovn_controller[94850]: 2026-01-22T22:32:17Z|00317|binding|INFO|Releasing lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 from this chassis (sb_readonly=0)
Jan 22 22:32:17 compute-0 ovn_controller[94850]: 2026-01-22T22:32:17Z|00318|binding|INFO|Setting lport 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 down in Southbound
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.725 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 ovn_controller[94850]: 2026-01-22T22:32:17Z|00319|binding|INFO|Removing iface tap8e8cfdc3-60 ovn-installed in OVS
Jan 22 22:32:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:17.734 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:60:70 10.100.0.14'], port_security=['fa:16:3e:b3:60:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '815ebbb8-e2c4-4f72-8048-df7c53f1439a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=8e8cfdc3-60bc-4edf-89ba-c53573ea3141) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:17.736 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 8e8cfdc3-60bc-4edf-89ba-c53573ea3141 in datapath f234f62b-5371-4527-94e7-91cf5da3055e unbound from our chassis
Jan 22 22:32:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:17.737 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f234f62b-5371-4527-94e7-91cf5da3055e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:17.738 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5f1045-732f-4cbc-b03d-c1716e929318]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:17.739 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace which is not needed anymore
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.747 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 22 22:32:17 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000058.scope: Consumed 13.074s CPU time.
Jan 22 22:32:17 compute-0 systemd-machined[154006]: Machine qemu-38-instance-00000058 terminated.
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [NOTICE]   (223479) : haproxy version is 2.8.14-c23fe91
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [NOTICE]   (223479) : path to executable is /usr/sbin/haproxy
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [WARNING]  (223479) : Exiting Master process...
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [WARNING]  (223479) : Exiting Master process...
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [ALERT]    (223479) : Current worker (223481) exited with code 143 (Terminated)
Jan 22 22:32:17 compute-0 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223475]: [WARNING]  (223479) : All workers exited. Exiting... (0)
Jan 22 22:32:17 compute-0 systemd[1]: libpod-4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7.scope: Deactivated successfully.
Jan 22 22:32:17 compute-0 conmon[223475]: conmon 4805ab881f2693cdc23e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7.scope/container/memory.events
Jan 22 22:32:17 compute-0 podman[223889]: 2026-01-22 22:32:17.880740015 +0000 UTC m=+0.047458376 container died 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 22:32:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7-userdata-shm.mount: Deactivated successfully.
Jan 22 22:32:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-89ae5f6b14701300d3a0edbf16ef68e9f059c1e8207c1da18d9e251b811e78af-merged.mount: Deactivated successfully.
Jan 22 22:32:17 compute-0 podman[223889]: 2026-01-22 22:32:17.938583572 +0000 UTC m=+0.105301933 container cleanup 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:32:17 compute-0 systemd[1]: libpod-conmon-4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7.scope: Deactivated successfully.
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.952 182729 INFO nova.virt.libvirt.driver [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Instance destroyed successfully.
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.955 182729 DEBUG nova.objects.instance [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'resources' on Instance uuid 815ebbb8-e2c4-4f72-8048-df7c53f1439a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.969 182729 DEBUG nova.virt.libvirt.vif [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-871599483',display_name='tempest-ListServerFiltersTestJSON-instance-871599483',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-871599483',id=88,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-34audv8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:59Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=815ebbb8-e2c4-4f72-8048-df7c53f1439a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.970 182729 DEBUG nova.network.os_vif_util [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "address": "fa:16:3e:b3:60:70", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e8cfdc3-60", "ovs_interfaceid": "8e8cfdc3-60bc-4edf-89ba-c53573ea3141", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.971 182729 DEBUG nova.network.os_vif_util [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.971 182729 DEBUG os_vif [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.972 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.973 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e8cfdc3-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.974 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.976 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.978 182729 INFO os_vif [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:60:70,bridge_name='br-int',has_traffic_filtering=True,id=8e8cfdc3-60bc-4edf-89ba-c53573ea3141,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e8cfdc3-60')
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.979 182729 INFO nova.virt.libvirt.driver [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Deleting instance files /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a_del
Jan 22 22:32:17 compute-0 nova_compute[182725]: 2026-01-22 22:32:17.979 182729 INFO nova.virt.libvirt.driver [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Deletion of /var/lib/nova/instances/815ebbb8-e2c4-4f72-8048-df7c53f1439a_del complete
Jan 22 22:32:18 compute-0 podman[223937]: 2026-01-22 22:32:18.021778767 +0000 UTC m=+0.048171574 container remove 4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.027 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[68da14fc-af1a-4c74-b40a-ee27bb2e5c77]: (4, ('Thu Jan 22 10:32:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7)\n4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7\nThu Jan 22 10:32:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7)\n4805ab881f2693cdc23e786d528ca28803634e977a3372a52438248bbb98b1f7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.029 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1d1c77-24e1-4a1b-93a7-5e01f3d658bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.031 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:18 compute-0 kernel: tapf234f62b-50: left promiscuous mode
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.045 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.049 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[475113ba-72ef-4605-ad85-51f639d08106]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.055 182729 INFO nova.compute.manager [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.056 182729 DEBUG oslo.service.loopingcall [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.056 182729 DEBUG nova.compute.manager [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.057 182729 DEBUG nova.network.neutron [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.063 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[73a921b6-9b45-4705-8ea4-1ba72cdffe33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.064 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[af8c3ab3-15a0-42a0-9523-82658f1aa6a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.080 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ecae398f-9a47-4298-8e4d-efbdc0ccdaee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475673, 'reachable_time': 36286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223952, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 systemd[1]: run-netns-ovnmeta\x2df234f62b\x2d5371\x2d4527\x2d94e7\x2d91cf5da3055e.mount: Deactivated successfully.
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.086 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:32:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:18.086 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfea851-1ab2-4d60-979b-cecf0ddb3db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.137 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121123.1358786, 63595c63-b40f-4491-be23-cc90675eb94e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.137 182729 INFO nova.compute.manager [-] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] VM Stopped (Lifecycle Event)
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.158 182729 DEBUG nova.compute.manager [None req-917ea95c-5951-4584-840e-9c6409662cf3 - - - - - -] [instance: 63595c63-b40f-4491-be23-cc90675eb94e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.796 182729 DEBUG nova.network.neutron [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.821 182729 INFO nova.compute.manager [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Took 0.76 seconds to deallocate network for instance.
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.840 182729 DEBUG nova.compute.manager [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.840 182729 DEBUG oslo_concurrency.lockutils [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.841 182729 DEBUG oslo_concurrency.lockutils [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.841 182729 DEBUG oslo_concurrency.lockutils [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.841 182729 DEBUG nova.compute.manager [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.842 182729 DEBUG nova.compute.manager [req-85bd7bfa-2077-4188-86d1-cca6c6c13762 req-47526613-eea4-4fa8-b5cc-37cef80e8e35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-unplugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.913 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.914 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.952 182729 DEBUG nova.compute.manager [req-5ebb4597-00f5-43a7-8534-54031ebd9d8f req-dba38ac1-f401-447b-9e27-95212bd9b8fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-deleted-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:18 compute-0 nova_compute[182725]: 2026-01-22 22:32:18.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.018 182729 DEBUG nova.compute.provider_tree [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.035 182729 DEBUG nova.scheduler.client.report [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.054 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.088 182729 INFO nova.scheduler.client.report [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Deleted allocations for instance 815ebbb8-e2c4-4f72-8048-df7c53f1439a
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.169 182729 DEBUG oslo_concurrency.lockutils [None req-4c09f464-897e-44e3-83bc-36e77117f770 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:19 compute-0 podman[223954]: 2026-01-22 22:32:19.171761534 +0000 UTC m=+0.094196033 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 22 22:32:19 compute-0 podman[223953]: 2026-01-22 22:32:19.18708657 +0000 UTC m=+0.115992832 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.594 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:19 compute-0 nova_compute[182725]: 2026-01-22 22:32:19.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:20 compute-0 nova_compute[182725]: 2026-01-22 22:32:20.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:20 compute-0 nova_compute[182725]: 2026-01-22 22:32:20.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:32:20 compute-0 nova_compute[182725]: 2026-01-22 22:32:20.892 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.131 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.132 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.132 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.132 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:21 compute-0 ovn_controller[94850]: 2026-01-22T22:32:21Z|00320|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.314 182729 DEBUG nova.compute.manager [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.314 182729 DEBUG oslo_concurrency.lockutils [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.315 182729 DEBUG oslo_concurrency.lockutils [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.316 182729 DEBUG oslo_concurrency.lockutils [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "815ebbb8-e2c4-4f72-8048-df7c53f1439a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.316 182729 DEBUG nova.compute.manager [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] No waiting events found dispatching network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.316 182729 WARNING nova.compute.manager [req-205e45e6-ab4f-48a0-a856-b6f9823898f5 req-37ae70dc-6435-41d2-b160-d0c3cf94297b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Received unexpected event network-vif-plugged-8e8cfdc3-60bc-4edf-89ba-c53573ea3141 for instance with vm_state deleted and task_state None.
Jan 22 22:32:21 compute-0 nova_compute[182725]: 2026-01-22 22:32:21.317 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:22 compute-0 nova_compute[182725]: 2026-01-22 22:32:22.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:23 compute-0 nova_compute[182725]: 2026-01-22 22:32:23.564 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:23 compute-0 nova_compute[182725]: 2026-01-22 22:32:23.599 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:32:23 compute-0 nova_compute[182725]: 2026-01-22 22:32:23.599 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:32:23 compute-0 nova_compute[182725]: 2026-01-22 22:32:23.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:23 compute-0 nova_compute[182725]: 2026-01-22 22:32:23.955 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:24 compute-0 ovn_controller[94850]: 2026-01-22T22:32:24Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:32:24 compute-0 nova_compute[182725]: 2026-01-22 22:32:24.519 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:24 compute-0 nova_compute[182725]: 2026-01-22 22:32:24.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:24 compute-0 nova_compute[182725]: 2026-01-22 22:32:24.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:24 compute-0 nova_compute[182725]: 2026-01-22 22:32:24.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:32:25 compute-0 nova_compute[182725]: 2026-01-22 22:32:25.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:32:27 compute-0 nova_compute[182725]: 2026-01-22 22:32:27.831 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:27 compute-0 nova_compute[182725]: 2026-01-22 22:32:27.977 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:28 compute-0 nova_compute[182725]: 2026-01-22 22:32:28.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:28 compute-0 ovn_controller[94850]: 2026-01-22T22:32:28Z|00321|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:28 compute-0 nova_compute[182725]: 2026-01-22 22:32:28.661 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:28 compute-0 nova_compute[182725]: 2026-01-22 22:32:28.958 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:29 compute-0 podman[224008]: 2026-01-22 22:32:29.135947987 +0000 UTC m=+0.056807321 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:32:29 compute-0 podman[224009]: 2026-01-22 22:32:29.137560288 +0000 UTC m=+0.054526834 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.025 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.026 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.026 182729 INFO nova.compute.manager [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Rebooting instance
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.053 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.053 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:32:30 compute-0 nova_compute[182725]: 2026-01-22 22:32:30.054 182729 DEBUG nova.network.neutron [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.090 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.252 182729 DEBUG nova.network.neutron [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.277 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.289 182729 DEBUG nova.compute.manager [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:32 compute-0 kernel: tapb56a4401-4c (unregistering): left promiscuous mode
Jan 22 22:32:32 compute-0 NetworkManager[54954]: <info>  [1769121152.5391] device (tapb56a4401-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:32:32 compute-0 ovn_controller[94850]: 2026-01-22T22:32:32Z|00322|binding|INFO|Releasing lport b56a4401-4c89-482a-a347-ca080a879f8f from this chassis (sb_readonly=0)
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.545 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 ovn_controller[94850]: 2026-01-22T22:32:32Z|00323|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f down in Southbound
Jan 22 22:32:32 compute-0 ovn_controller[94850]: 2026-01-22T22:32:32Z|00324|binding|INFO|Removing iface tapb56a4401-4c ovn-installed in OVS
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.547 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.553 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.554 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.556 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.557 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf1ed27-5915-4101-9754-b6eaf07af8ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.558 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.564 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 22 22:32:32 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000057.scope: Consumed 12.975s CPU time.
Jan 22 22:32:32 compute-0 systemd-machined[154006]: Machine qemu-39-instance-00000057 terminated.
Jan 22 22:32:32 compute-0 podman[224053]: 2026-01-22 22:32:32.615557234 +0000 UTC m=+0.050115023 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [NOTICE]   (223813) : haproxy version is 2.8.14-c23fe91
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [NOTICE]   (223813) : path to executable is /usr/sbin/haproxy
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [WARNING]  (223813) : Exiting Master process...
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [WARNING]  (223813) : Exiting Master process...
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [ALERT]    (223813) : Current worker (223815) exited with code 143 (Terminated)
Jan 22 22:32:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[223809]: [WARNING]  (223813) : All workers exited. Exiting... (0)
Jan 22 22:32:32 compute-0 systemd[1]: libpod-86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692.scope: Deactivated successfully.
Jan 22 22:32:32 compute-0 podman[224096]: 2026-01-22 22:32:32.69443522 +0000 UTC m=+0.047257571 container died 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.713 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.717 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692-userdata-shm.mount: Deactivated successfully.
Jan 22 22:32:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a2b7455e8b4553992d821de71eefd436ef75c65193fbc253a68f11ecc65be6a-merged.mount: Deactivated successfully.
Jan 22 22:32:32 compute-0 podman[224096]: 2026-01-22 22:32:32.742741486 +0000 UTC m=+0.095563817 container cleanup 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:32:32 compute-0 systemd[1]: libpod-conmon-86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692.scope: Deactivated successfully.
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.752 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance destroyed successfully.
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.752 182729 DEBUG nova.objects.instance [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.786 182729 DEBUG nova.virt.libvirt.vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.786 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.787 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.787 182729 DEBUG os_vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.788 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.789 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb56a4401-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.795 182729 INFO os_vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.802 182729 DEBUG nova.virt.libvirt.driver [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Start _get_guest_xml network_info=[{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.805 182729 WARNING nova.virt.libvirt.driver [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.818 182729 DEBUG nova.virt.libvirt.host [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.819 182729 DEBUG nova.virt.libvirt.host [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:32:32 compute-0 podman[224140]: 2026-01-22 22:32:32.823811008 +0000 UTC m=+0.056621717 container remove 86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.827 182729 DEBUG nova.virt.libvirt.host [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.828 182729 DEBUG nova.virt.libvirt.host [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.829 182729 DEBUG nova.virt.libvirt.driver [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.829 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.829 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2558d12f-d4ea-4216-9381-310c577f2196]: (4, ('Thu Jan 22 10:32:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692)\n86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692\nThu Jan 22 10:32:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692)\n86437de686486f785a204856f10e2c38099cef6b116e7e6d4c463f56030a5692\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.830 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.830 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.830 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.830 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.831 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.831 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.831 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.831 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.832 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.831 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b91b11-1d98-43d5-890c-491ccc3fc3fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.832 182729 DEBUG nova.virt.hardware [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.832 182729 DEBUG nova.objects.instance [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.832 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.835 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.842 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1fbd59-0008-4fc7-9666-ad9bb088317c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.865 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f109d5d7-689b-407a-99b0-55936cfc51cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.866 182729 DEBUG nova.virt.libvirt.vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.867 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa37835-41ad-45b9-852f-3a76ea7abadb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.867 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.868 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.870 182729 DEBUG nova.objects.instance [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.884 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f65b0e9f-096d-42b5-a647-71d00ebc64e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476527, 'reachable_time': 40460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224153, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.888 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:32:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:32.888 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[01499d71-2526-4449-9ce9-35a7186fdfa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.898 182729 DEBUG nova.virt.libvirt.driver [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <uuid>454ec87b-a45c-40af-8bce-d252eea19620</uuid>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <name>instance-00000057</name>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-256562799</nova:name>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:32:32</nova:creationTime>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         <nova:port uuid="b56a4401-4c89-482a-a347-ca080a879f8f">
Jan 22 22:32:32 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <system>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="serial">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="uuid">454ec87b-a45c-40af-8bce-d252eea19620</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </system>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <os>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </os>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <features>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </features>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:f0:b5:89"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <target dev="tapb56a4401-4c"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/console.log" append="off"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <video>
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </video>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:32:32 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:32:32 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:32:32 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:32:32 compute-0 nova_compute[182725]: </domain>
Jan 22 22:32:32 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.899 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.951 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121137.949618, 815ebbb8-e2c4-4f72-8048-df7c53f1439a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.952 182729 INFO nova.compute.manager [-] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] VM Stopped (Lifecycle Event)
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.963 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.964 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:32 compute-0 nova_compute[182725]: 2026-01-22 22:32:32.997 182729 DEBUG nova.compute.manager [None req-58ec257d-5a96-4f02-a5f6-1c4681afe397 - - - - - -] [instance: 815ebbb8-e2c4-4f72-8048-df7c53f1439a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.025 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.027 182729 DEBUG nova.objects.instance [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'trusted_certs' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.041 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.108 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.109 182729 DEBUG nova.virt.disk.api [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.109 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.172 182729 DEBUG oslo_concurrency.processutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.173 182729 DEBUG nova.virt.disk.api [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.174 182729 DEBUG nova.objects.instance [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.191 182729 DEBUG nova.virt.libvirt.vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.192 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.193 182729 DEBUG nova.network.os_vif_util [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.193 182729 DEBUG os_vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.194 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.195 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.195 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.198 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb56a4401-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.199 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb56a4401-4c, col_values=(('external_ids', {'iface-id': 'b56a4401-4c89-482a-a347-ca080a879f8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:b5:89', 'vm-uuid': '454ec87b-a45c-40af-8bce-d252eea19620'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.2013] manager: (tapb56a4401-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.202 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.207 182729 INFO os_vif [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:32:33 compute-0 kernel: tapb56a4401-4c: entered promiscuous mode
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.3061] manager: (tapb56a4401-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.306 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 ovn_controller[94850]: 2026-01-22T22:32:33Z|00325|binding|INFO|Claiming lport b56a4401-4c89-482a-a347-ca080a879f8f for this chassis.
Jan 22 22:32:33 compute-0 ovn_controller[94850]: 2026-01-22T22:32:33Z|00326|binding|INFO|b56a4401-4c89-482a-a347-ca080a879f8f: Claiming fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:32:33 compute-0 systemd-udevd[224066]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.320 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:33 compute-0 ovn_controller[94850]: 2026-01-22T22:32:33Z|00327|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f ovn-installed in OVS
Jan 22 22:32:33 compute-0 ovn_controller[94850]: 2026-01-22T22:32:33Z|00328|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f up in Southbound
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.321 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.323 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.3256] device (tapb56a4401-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.3272] device (tapb56a4401-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.337 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19a42918-8655-4338-9625-226ae4fa64fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.339 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.340 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.341 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc054b7-aba8-4593-9ddb-2da56e84179c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.341 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[049c4ecf-6f34-4137-8922-2eee6d5e708a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 systemd-machined[154006]: New machine qemu-40-instance-00000057.
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.357 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[3452c560-17e6-46a3-b112-d0eb59722012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-00000057.
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.374 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7672a6aa-e495-4db3-a6a7-91085c5703e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.411 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[35805194-2a93-4476-a278-06ad8f30ba59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.417 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[77727407-92ce-4681-9ae3-467f2f0b0cd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.4181] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/163)
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.465 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0c287d7d-a780-47d1-8ad8-1ab423665622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.468 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[5c78968b-5a57-448d-993a-bc2e24704699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.4943] device (tape65877e5-00): carrier: link connected
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.501 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3d2cf4-01f6-4307-8e66-e16ef7847e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.524 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2d83ce7d-b0e1-4d0b-a114-868e614d37cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479109, 'reachable_time': 37648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224212, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.540 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cda2248b-aeba-458f-bb17-2acf7ce099a0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479109, 'tstamp': 479109}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224213, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.568 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4de43f2b-b48d-4771-a912-530e970ec533]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479109, 'reachable_time': 37648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224214, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.613 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb3ff68-d8c5-4c68-84b8-93357db0074e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.690 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b77b61b4-00f2-45a9-8695-b4deac56da0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.692 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.693 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.693 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.695 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 NetworkManager[54954]: <info>  [1769121153.6963] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.701 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.702 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 ovn_controller[94850]: 2026-01-22T22:32:33Z|00329|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.705 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.706 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5a309a43-38ca-4ea8-bd4b-40eed599187a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.707 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:32:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:33.708 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.716 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:33 compute-0 nova_compute[182725]: 2026-01-22 22:32:33.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.077 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 454ec87b-a45c-40af-8bce-d252eea19620 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.078 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121154.0769763, 454ec87b-a45c-40af-8bce-d252eea19620 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.078 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Resumed (Lifecycle Event)
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.088 182729 DEBUG nova.compute.manager [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:32:34 compute-0 podman[224252]: 2026-01-22 22:32:34.089103579 +0000 UTC m=+0.045260321 container create ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.092 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance rebooted successfully.
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.093 182729 DEBUG nova.compute.manager [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.102 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.105 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:34 compute-0 systemd[1]: Started libpod-conmon-ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b.scope.
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.128 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.129 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121154.087188, 454ec87b-a45c-40af-8bce-d252eea19620 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.129 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Started (Lifecycle Event)
Jan 22 22:32:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.154 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e080680cc5f28728dc0d89d678f10cd8eac5e11c27d0168116f25ddc5eddb875/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.159 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:34 compute-0 podman[224252]: 2026-01-22 22:32:34.066020487 +0000 UTC m=+0.022177259 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:32:34 compute-0 nova_compute[182725]: 2026-01-22 22:32:34.176 182729 DEBUG oslo_concurrency.lockutils [None req-9082cfbb-633d-43be-964f-db1d36030059 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:34 compute-0 podman[224252]: 2026-01-22 22:32:34.177758321 +0000 UTC m=+0.133915163 container init ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:32:34 compute-0 podman[224252]: 2026-01-22 22:32:34.185026694 +0000 UTC m=+0.141183436 container start ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:32:34 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [NOTICE]   (224273) : New worker (224275) forked
Jan 22 22:32:34 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [NOTICE]   (224273) : Loading success.
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.290 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.291 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.307 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.439 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.439 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.445 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.446 182729 INFO nova.compute.claims [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.582 182729 DEBUG nova.compute.provider_tree [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.599 182729 DEBUG nova.scheduler.client.report [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.632 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.633 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.703 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.703 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.719 182729 INFO nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.740 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.860 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.861 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.862 182729 INFO nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Creating image(s)
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.862 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.863 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.863 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.875 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.954 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.956 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.957 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:36 compute-0 nova_compute[182725]: 2026-01-22 22:32:36.968 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.026 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.028 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.060 182729 DEBUG nova.policy [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '018d4fb3a46e4c0893d7b27e056744ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efb3126bbf024dfe96369b57d5971e94', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.073 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.074 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.074 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.137 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.138 182729 DEBUG nova.virt.disk.api [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Checking if we can resize image /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.138 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.201 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.201 182729 DEBUG nova.virt.disk.api [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Cannot resize image /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.202 182729 DEBUG nova.objects.instance [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lazy-loading 'migration_context' on Instance uuid 0ab183cd-e3db-41c8-b78e-ce285370129b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.218 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.218 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Ensure instance console log exists: /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.219 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.219 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.219 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.652 182729 DEBUG nova.compute.manager [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.653 182729 DEBUG oslo_concurrency.lockutils [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.654 182729 DEBUG oslo_concurrency.lockutils [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.654 182729 DEBUG oslo_concurrency.lockutils [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.655 182729 DEBUG nova.compute.manager [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.655 182729 WARNING nova.compute.manager [req-554cd5b6-da0a-4212-aa11-237c0f08e327 req-f8a68a85-15b2-4839-9cc7-78ccec6ff38a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:32:37 compute-0 nova_compute[182725]: 2026-01-22 22:32:37.712 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Successfully created port: c6243b5a-135b-4ae9-8a30-87c11fd8f953 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.203 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.673 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Successfully updated port: c6243b5a-135b-4ae9-8a30-87c11fd8f953 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.685 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.686 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquired lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.686 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.798 182729 DEBUG nova.compute.manager [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-changed-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.799 182729 DEBUG nova.compute.manager [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Refreshing instance network info cache due to event network-changed-c6243b5a-135b-4ae9-8a30-87c11fd8f953. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.799 182729 DEBUG oslo_concurrency.lockutils [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.862 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:32:38 compute-0 nova_compute[182725]: 2026-01-22 22:32:38.964 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.783 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.784 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.785 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.785 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.786 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.786 182729 WARNING nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.787 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.788 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.788 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.789 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.789 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.790 182729 WARNING nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.790 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.791 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.791 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.792 182729 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.793 182729 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:39 compute-0 nova_compute[182725]: 2026-01-22 22:32:39.793 182729 WARNING nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state None.
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.152 182729 DEBUG nova.network.neutron [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Updating instance_info_cache with network_info: [{"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.179 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Releasing lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.179 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Instance network_info: |[{"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.180 182729 DEBUG oslo_concurrency.lockutils [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.180 182729 DEBUG nova.network.neutron [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Refreshing network info cache for port c6243b5a-135b-4ae9-8a30-87c11fd8f953 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.182 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Start _get_guest_xml network_info=[{"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.188 182729 WARNING nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.194 182729 DEBUG nova.virt.libvirt.host [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.194 182729 DEBUG nova.virt.libvirt.host [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.198 182729 DEBUG nova.virt.libvirt.host [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.199 182729 DEBUG nova.virt.libvirt.host [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.201 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.201 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.201 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.201 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.202 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.202 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.202 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.202 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.202 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.203 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.203 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.203 182729 DEBUG nova.virt.hardware [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.206 182729 DEBUG nova.virt.libvirt.vif [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-896050795',display_name='tempest-NoVNCConsoleTestJSON-server-896050795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-896050795',id=95,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efb3126bbf024dfe96369b57d5971e94',ramdisk_id='',reservation_id='r-mbxnap8q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-904362672',owner_user_name='tempest-NoVNCConsoleTestJSON-904362672
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:36Z,user_data=None,user_id='018d4fb3a46e4c0893d7b27e056744ff',uuid=0ab183cd-e3db-41c8-b78e-ce285370129b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.207 182729 DEBUG nova.network.os_vif_util [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converting VIF {"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.208 182729 DEBUG nova.network.os_vif_util [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.209 182729 DEBUG nova.objects.instance [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ab183cd-e3db-41c8-b78e-ce285370129b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.232 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <uuid>0ab183cd-e3db-41c8-b78e-ce285370129b</uuid>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <name>instance-0000005f</name>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:name>tempest-NoVNCConsoleTestJSON-server-896050795</nova:name>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:32:40</nova:creationTime>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:user uuid="018d4fb3a46e4c0893d7b27e056744ff">tempest-NoVNCConsoleTestJSON-904362672-project-member</nova:user>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:project uuid="efb3126bbf024dfe96369b57d5971e94">tempest-NoVNCConsoleTestJSON-904362672</nova:project>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         <nova:port uuid="c6243b5a-135b-4ae9-8a30-87c11fd8f953">
Jan 22 22:32:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <system>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="serial">0ab183cd-e3db-41c8-b78e-ce285370129b</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="uuid">0ab183cd-e3db-41c8-b78e-ce285370129b</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </system>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <os>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </os>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <features>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </features>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.config"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:aa:a6:7a"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <target dev="tapc6243b5a-13"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/console.log" append="off"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <video>
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </video>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:32:40 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:32:40 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:32:40 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:32:40 compute-0 nova_compute[182725]: </domain>
Jan 22 22:32:40 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.234 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Preparing to wait for external event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.234 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.234 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.234 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.235 182729 DEBUG nova.virt.libvirt.vif [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-896050795',display_name='tempest-NoVNCConsoleTestJSON-server-896050795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-896050795',id=95,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efb3126bbf024dfe96369b57d5971e94',ramdisk_id='',reservation_id='r-mbxnap8q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-904362672',owner_user_name='tempest-NoVNCConsoleTestJSON
-904362672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:36Z,user_data=None,user_id='018d4fb3a46e4c0893d7b27e056744ff',uuid=0ab183cd-e3db-41c8-b78e-ce285370129b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.235 182729 DEBUG nova.network.os_vif_util [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converting VIF {"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.236 182729 DEBUG nova.network.os_vif_util [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.236 182729 DEBUG os_vif [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.237 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.237 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.237 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.243 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.243 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6243b5a-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.244 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6243b5a-13, col_values=(('external_ids', {'iface-id': 'c6243b5a-135b-4ae9-8a30-87c11fd8f953', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:a6:7a', 'vm-uuid': '0ab183cd-e3db-41c8-b78e-ce285370129b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.245 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 NetworkManager[54954]: <info>  [1769121160.2492] manager: (tapc6243b5a-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.249 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.254 182729 INFO os_vif [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13')
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.308 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.309 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.309 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] No VIF found with MAC fa:16:3e:aa:a6:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.310 182729 INFO nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Using config drive
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.638 182729 INFO nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Creating config drive at /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.config
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.644 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgz8o0spc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.776 182729 DEBUG oslo_concurrency.processutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgz8o0spc" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:32:40 compute-0 kernel: tapc6243b5a-13: entered promiscuous mode
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.854 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 ovn_controller[94850]: 2026-01-22T22:32:40Z|00330|binding|INFO|Claiming lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 for this chassis.
Jan 22 22:32:40 compute-0 ovn_controller[94850]: 2026-01-22T22:32:40Z|00331|binding|INFO|c6243b5a-135b-4ae9-8a30-87c11fd8f953: Claiming fa:16:3e:aa:a6:7a 10.100.0.4
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.869 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:a6:7a 10.100.0.4'], port_security=['fa:16:3e:aa:a6:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0ab183cd-e3db-41c8-b78e-ce285370129b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d01af9a8-5c81-4331-a414-f923e336df0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efb3126bbf024dfe96369b57d5971e94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0cf04abb-f4c5-41e8-8658-0e02a08f6dd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=243038a4-69be-4528-a8f5-0e56493a826a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c6243b5a-135b-4ae9-8a30-87c11fd8f953) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:40 compute-0 NetworkManager[54954]: <info>  [1769121160.8710] manager: (tapc6243b5a-13): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.871 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c6243b5a-135b-4ae9-8a30-87c11fd8f953 in datapath d01af9a8-5c81-4331-a414-f923e336df0b bound to our chassis
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.873 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d01af9a8-5c81-4331-a414-f923e336df0b
Jan 22 22:32:40 compute-0 ovn_controller[94850]: 2026-01-22T22:32:40Z|00332|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 ovn-installed in OVS
Jan 22 22:32:40 compute-0 ovn_controller[94850]: 2026-01-22T22:32:40Z|00333|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 up in Southbound
Jan 22 22:32:40 compute-0 nova_compute[182725]: 2026-01-22 22:32:40.888 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:40 compute-0 systemd-udevd[224321]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:32:40 compute-0 systemd-machined[154006]: New machine qemu-41-instance-0000005f.
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.898 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9cc207-96e5-4d55-8f2d-462e693b6df6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.902 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd01af9a8-51 in ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.904 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd01af9a8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.904 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c31d23-6667-4307-93cd-dd2d6e488b70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-0000005f.
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.905 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a026fafd-edab-4912-9dbb-f334f955a36d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 NetworkManager[54954]: <info>  [1769121160.9169] device (tapc6243b5a-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:32:40 compute-0 NetworkManager[54954]: <info>  [1769121160.9175] device (tapc6243b5a-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.922 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[af4bb8b8-b5f7-4503-8c67-9883dee99b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.945 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e834bb-8f21-4772-a283-008262aa1643]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.976 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[661dab36-62a8-4d8e-8fa0-68b67776b698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:40 compute-0 systemd-udevd[224324]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:32:40 compute-0 NetworkManager[54954]: <info>  [1769121160.9845] manager: (tapd01af9a8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Jan 22 22:32:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:40.983 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[55da4545-7d6b-4137-a517-b753130ca394]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.017 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[26d230bb-407a-4669-80b0-54b39a2839c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.021 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3eec860e-70ab-45c6-9399-5e3823177309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 NetworkManager[54954]: <info>  [1769121161.0434] device (tapd01af9a8-50): carrier: link connected
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.048 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb70070-a662-4cd4-b7e7-764224dc06aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.064 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9b9ec9-6291-4396-ae38-5bbea3fd2527]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd01af9a8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:b1:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479864, 'reachable_time': 29425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224353, 'error': None, 'target': 'ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.083 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fecd9e65-a2e7-455d-beea-6a20ce7672e9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:b146'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479864, 'tstamp': 479864}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224354, 'error': None, 'target': 'ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.097 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1b56a1fb-6a2c-4860-ab99-43135190d3f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd01af9a8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:b1:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479864, 'reachable_time': 29425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224355, 'error': None, 'target': 'ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.139 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b139cd-220c-46d1-bcac-6644ab11fbd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.204 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c63be529-eded-4fda-b7a6-6cff68fb8ecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.213 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd01af9a8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.213 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.214 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd01af9a8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.216 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:41 compute-0 NetworkManager[54954]: <info>  [1769121161.2170] manager: (tapd01af9a8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 22 22:32:41 compute-0 kernel: tapd01af9a8-50: entered promiscuous mode
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.257 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd01af9a8-50, col_values=(('external_ids', {'iface-id': '52c616a3-f3c8-41f1-a978-804a1570a8ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.259 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:41 compute-0 ovn_controller[94850]: 2026-01-22T22:32:41Z|00334|binding|INFO|Releasing lport 52c616a3-f3c8-41f1-a978-804a1570a8ca from this chassis (sb_readonly=0)
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.260 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.262 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d01af9a8-5c81-4331-a414-f923e336df0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d01af9a8-5c81-4331-a414-f923e336df0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.263 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ad152373-425f-4e14-ba7e-1603ab6dd573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.264 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-d01af9a8-5c81-4331-a414-f923e336df0b
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/d01af9a8-5c81-4331-a414-f923e336df0b.pid.haproxy
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID d01af9a8-5c81-4331-a414-f923e336df0b
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:32:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:41.265 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b', 'env', 'PROCESS_TAG=haproxy-d01af9a8-5c81-4331-a414-f923e336df0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d01af9a8-5c81-4331-a414-f923e336df0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.271 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.445 182729 DEBUG nova.compute.manager [req-e94f67af-cfc1-4c1c-abe7-3a970d096443 req-ada63786-25aa-4e02-8f8e-e1d87c80ee74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.445 182729 DEBUG oslo_concurrency.lockutils [req-e94f67af-cfc1-4c1c-abe7-3a970d096443 req-ada63786-25aa-4e02-8f8e-e1d87c80ee74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.446 182729 DEBUG oslo_concurrency.lockutils [req-e94f67af-cfc1-4c1c-abe7-3a970d096443 req-ada63786-25aa-4e02-8f8e-e1d87c80ee74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.446 182729 DEBUG oslo_concurrency.lockutils [req-e94f67af-cfc1-4c1c-abe7-3a970d096443 req-ada63786-25aa-4e02-8f8e-e1d87c80ee74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.447 182729 DEBUG nova.compute.manager [req-e94f67af-cfc1-4c1c-abe7-3a970d096443 req-ada63786-25aa-4e02-8f8e-e1d87c80ee74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Processing event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.499 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121161.4986467, 0ab183cd-e3db-41c8-b78e-ce285370129b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.499 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] VM Started (Lifecycle Event)
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.501 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.513 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.517 182729 INFO nova.virt.libvirt.driver [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Instance spawned successfully.
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.518 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.565 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.566 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.566 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.567 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.567 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.567 182729 DEBUG nova.virt.libvirt.driver [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.572 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.576 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.616 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.617 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121161.4988048, 0ab183cd-e3db-41c8-b78e-ce285370129b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.617 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] VM Paused (Lifecycle Event)
Jan 22 22:32:41 compute-0 podman[224393]: 2026-01-22 22:32:41.679648683 +0000 UTC m=+0.064353892 container create d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:32:41 compute-0 systemd[1]: Started libpod-conmon-d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11.scope.
Jan 22 22:32:41 compute-0 podman[224393]: 2026-01-22 22:32:41.653138515 +0000 UTC m=+0.037843754 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:32:41 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:32:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6c98ae5184500a8614df6c2ea8a04110832022d3f75c7bde1678484cc55414/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.770 182729 INFO nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Took 4.91 seconds to spawn the instance on the hypervisor.
Jan 22 22:32:41 compute-0 podman[224393]: 2026-01-22 22:32:41.770842729 +0000 UTC m=+0.155547938 container init d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.770 182729 DEBUG nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:41 compute-0 podman[224393]: 2026-01-22 22:32:41.776281486 +0000 UTC m=+0.160986695 container start d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 22:32:41 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [NOTICE]   (224412) : New worker (224414) forked
Jan 22 22:32:41 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [NOTICE]   (224412) : Loading success.
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.835 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.840 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121161.5123522, 0ab183cd-e3db-41c8-b78e-ce285370129b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.841 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] VM Resumed (Lifecycle Event)
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.887 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.891 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:32:41 compute-0 nova_compute[182725]: 2026-01-22 22:32:41.922 182729 INFO nova.compute.manager [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Took 5.52 seconds to build instance.
Jan 22 22:32:42 compute-0 nova_compute[182725]: 2026-01-22 22:32:42.031 182729 DEBUG nova.network.neutron [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Updated VIF entry in instance network info cache for port c6243b5a-135b-4ae9-8a30-87c11fd8f953. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:32:42 compute-0 nova_compute[182725]: 2026-01-22 22:32:42.032 182729 DEBUG nova.network.neutron [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Updating instance_info_cache with network_info: [{"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:42 compute-0 nova_compute[182725]: 2026-01-22 22:32:42.241 182729 DEBUG oslo_concurrency.lockutils [None req-d5de2a96-dce6-4bde-ae59-db266c2ae6e4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:42 compute-0 nova_compute[182725]: 2026-01-22 22:32:42.606 182729 DEBUG oslo_concurrency.lockutils [req-a72b95df-1098-48a2-a515-0ede6997d8b9 req-3596bbde-578d-4f42-afc3-8d9526a3463a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-0ab183cd-e3db-41c8-b78e-ce285370129b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.260 182729 DEBUG nova.compute.manager [None req-ea5bc9d7-8a03-49fc-8636-43143f9b2a07 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.560 182729 DEBUG nova.compute.manager [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.560 182729 DEBUG oslo_concurrency.lockutils [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.561 182729 DEBUG oslo_concurrency.lockutils [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.561 182729 DEBUG oslo_concurrency.lockutils [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.561 182729 DEBUG nova.compute.manager [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] No waiting events found dispatching network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.561 182729 WARNING nova.compute.manager [req-88249286-0fe1-46c6-ab65-168961da45a1 req-6b180fec-4bdf-4650-98a9-074a23e34ff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received unexpected event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 for instance with vm_state active and task_state None.
Jan 22 22:32:43 compute-0 nova_compute[182725]: 2026-01-22 22:32:43.965 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.024 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.072 182729 DEBUG nova.compute.manager [None req-efc3bab4-b572-4fc1-ac92-2e34cd1d49c4 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.536 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.537 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.537 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.537 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:44 compute-0 nova_compute[182725]: 2026-01-22 22:32:44.538 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:45 compute-0 nova_compute[182725]: 2026-01-22 22:32:45.246 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:45 compute-0 ovn_controller[94850]: 2026-01-22T22:32:45Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:b5:89 10.100.0.9
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.651 182729 INFO nova.compute.manager [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Terminating instance
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.687 182729 DEBUG nova.compute.manager [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:32:46 compute-0 kernel: tapc6243b5a-13 (unregistering): left promiscuous mode
Jan 22 22:32:46 compute-0 NetworkManager[54954]: <info>  [1769121166.7128] device (tapc6243b5a-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00335|binding|INFO|Releasing lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 from this chassis (sb_readonly=0)
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00336|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 down in Southbound
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.719 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00337|binding|INFO|Removing iface tapc6243b5a-13 ovn-installed in OVS
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.724 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.734 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00338|binding|INFO|Releasing lport 52c616a3-f3c8-41f1-a978-804a1570a8ca from this chassis (sb_readonly=0)
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00339|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.751 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:a6:7a 10.100.0.4'], port_security=['fa:16:3e:aa:a6:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0ab183cd-e3db-41c8-b78e-ce285370129b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d01af9a8-5c81-4331-a414-f923e336df0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efb3126bbf024dfe96369b57d5971e94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cf04abb-f4c5-41e8-8658-0e02a08f6dd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=243038a4-69be-4528-a8f5-0e56493a826a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c6243b5a-135b-4ae9-8a30-87c11fd8f953) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:46 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 22 22:32:46 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005f.scope: Consumed 5.838s CPU time.
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.754 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c6243b5a-135b-4ae9-8a30-87c11fd8f953 in datapath d01af9a8-5c81-4331-a414-f923e336df0b unbound from our chassis
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.756 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d01af9a8-5c81-4331-a414-f923e336df0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:46 compute-0 systemd-machined[154006]: Machine qemu-41-instance-0000005f terminated.
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.758 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7e970a3a-7639-45e5-b280-48cbab40f2a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.759 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b namespace which is not needed anymore
Jan 22 22:32:46 compute-0 podman[224426]: 2026-01-22 22:32:46.809150875 +0000 UTC m=+0.072986469 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:32:46 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [NOTICE]   (224412) : haproxy version is 2.8.14-c23fe91
Jan 22 22:32:46 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [NOTICE]   (224412) : path to executable is /usr/sbin/haproxy
Jan 22 22:32:46 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [WARNING]  (224412) : Exiting Master process...
Jan 22 22:32:46 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [ALERT]    (224412) : Current worker (224414) exited with code 143 (Terminated)
Jan 22 22:32:46 compute-0 neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b[224408]: [WARNING]  (224412) : All workers exited. Exiting... (0)
Jan 22 22:32:46 compute-0 systemd[1]: libpod-d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11.scope: Deactivated successfully.
Jan 22 22:32:46 compute-0 podman[224468]: 2026-01-22 22:32:46.906129987 +0000 UTC m=+0.047289582 container died d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:32:46 compute-0 kernel: tapc6243b5a-13: entered promiscuous mode
Jan 22 22:32:46 compute-0 kernel: tapc6243b5a-13 (unregistering): left promiscuous mode
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.922 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00340|binding|INFO|Claiming lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 for this chassis.
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00341|binding|INFO|c6243b5a-135b-4ae9-8a30-87c11fd8f953: Claiming fa:16:3e:aa:a6:7a 10.100.0.4
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.930 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:a6:7a 10.100.0.4'], port_security=['fa:16:3e:aa:a6:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0ab183cd-e3db-41c8-b78e-ce285370129b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d01af9a8-5c81-4331-a414-f923e336df0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efb3126bbf024dfe96369b57d5971e94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cf04abb-f4c5-41e8-8658-0e02a08f6dd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=243038a4-69be-4528-a8f5-0e56493a826a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c6243b5a-135b-4ae9-8a30-87c11fd8f953) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00342|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 ovn-installed in OVS
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00343|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 up in Southbound
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00344|binding|INFO|Releasing lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 from this chassis (sb_readonly=1)
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00345|if_status|INFO|Dropped 3 log messages in last 132 seconds (most recently, 132 seconds ago) due to excessive rate
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00346|if_status|INFO|Not setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 down as sb is readonly
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.943 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00347|binding|INFO|Removing iface tapc6243b5a-13 ovn-installed in OVS
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00348|binding|INFO|Releasing lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 from this chassis (sb_readonly=0)
Jan 22 22:32:46 compute-0 ovn_controller[94850]: 2026-01-22T22:32:46Z|00349|binding|INFO|Setting lport c6243b5a-135b-4ae9-8a30-87c11fd8f953 down in Southbound
Jan 22 22:32:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:46.955 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:a6:7a 10.100.0.4'], port_security=['fa:16:3e:aa:a6:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0ab183cd-e3db-41c8-b78e-ce285370129b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d01af9a8-5c81-4331-a414-f923e336df0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efb3126bbf024dfe96369b57d5971e94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cf04abb-f4c5-41e8-8658-0e02a08f6dd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=243038a4-69be-4528-a8f5-0e56493a826a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c6243b5a-135b-4ae9-8a30-87c11fd8f953) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11-userdata-shm.mount: Deactivated successfully.
Jan 22 22:32:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f6c98ae5184500a8614df6c2ea8a04110832022d3f75c7bde1678484cc55414-merged.mount: Deactivated successfully.
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.962 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:46 compute-0 podman[224468]: 2026-01-22 22:32:46.968483917 +0000 UTC m=+0.109643492 container cleanup d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:32:46 compute-0 systemd[1]: libpod-conmon-d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11.scope: Deactivated successfully.
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.988 182729 INFO nova.virt.libvirt.driver [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Instance destroyed successfully.
Jan 22 22:32:46 compute-0 nova_compute[182725]: 2026-01-22 22:32:46.990 182729 DEBUG nova.objects.instance [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lazy-loading 'resources' on Instance uuid 0ab183cd-e3db-41c8-b78e-ce285370129b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.005 182729 DEBUG nova.virt.libvirt.vif [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-896050795',display_name='tempest-NoVNCConsoleTestJSON-server-896050795',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-896050795',id=95,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:32:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efb3126bbf024dfe96369b57d5971e94',ramdisk_id='',reservation_id='r-mbxnap8q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-904362672',owner_user_name='tempest-NoVNCConsoleTestJSON-904362672-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:41Z,user_data=None,user_id='018d4fb3a46e4c0893d7b27e056744ff',uuid=0ab183cd-e3db-41c8-b78e-ce285370129b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.006 182729 DEBUG nova.network.os_vif_util [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converting VIF {"id": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "address": "fa:16:3e:aa:a6:7a", "network": {"id": "d01af9a8-5c81-4331-a414-f923e336df0b", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1625120248-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efb3126bbf024dfe96369b57d5971e94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6243b5a-13", "ovs_interfaceid": "c6243b5a-135b-4ae9-8a30-87c11fd8f953", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.008 182729 DEBUG nova.network.os_vif_util [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.009 182729 DEBUG os_vif [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.012 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.013 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6243b5a-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.015 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.017 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.021 182729 INFO os_vif [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=c6243b5a-135b-4ae9-8a30-87c11fd8f953,network=Network(d01af9a8-5c81-4331-a414-f923e336df0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6243b5a-13')
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.022 182729 INFO nova.virt.libvirt.driver [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Deleting instance files /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b_del
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.023 182729 INFO nova.virt.libvirt.driver [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Deletion of /var/lib/nova/instances/0ab183cd-e3db-41c8-b78e-ce285370129b_del complete
Jan 22 22:32:47 compute-0 podman[224512]: 2026-01-22 22:32:47.041466045 +0000 UTC m=+0.045091936 container remove d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.048 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cbc2f4-0732-40ca-94b4-8d9dd911624e]: (4, ('Thu Jan 22 10:32:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b (d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11)\nd56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11\nThu Jan 22 10:32:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b (d56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11)\nd56a4065e2231b285f9615eb162140f0b71be0dfe3e9c5e1c084532a15730e11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.051 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[00b2c19d-1bdb-4154-ba91-7b3dea3088ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.052 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd01af9a8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.054 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:47 compute-0 kernel: tapd01af9a8-50: left promiscuous mode
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.068 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.071 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c2c0b9-dd97-4d8d-873b-11db600eaff4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.084 182729 INFO nova.compute.manager [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.085 182729 DEBUG oslo.service.loopingcall [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.085 182729 DEBUG nova.compute.manager [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.085 182729 DEBUG nova.network.neutron [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.089 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5fd1cb4-8392-43cf-a2b3-ff5d6ee486e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.090 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[38de1595-0e74-4766-987c-9d605468e0a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.108 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[18ae5b43-7f87-42ab-8606-4cc44919ebb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479857, 'reachable_time': 30166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224527, 'error': None, 'target': 'ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 systemd[1]: run-netns-ovnmeta\x2dd01af9a8\x2d5c81\x2d4331\x2da414\x2df923e336df0b.mount: Deactivated successfully.
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.113 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d01af9a8-5c81-4331-a414-f923e336df0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.113 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[59f40dee-42af-428a-b3b4-e71e0546254b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.114 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c6243b5a-135b-4ae9-8a30-87c11fd8f953 in datapath d01af9a8-5c81-4331-a414-f923e336df0b unbound from our chassis
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.115 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d01af9a8-5c81-4331-a414-f923e336df0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.116 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19b1f55b-c967-45d0-8aac-ea143dd77591]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.116 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c6243b5a-135b-4ae9-8a30-87c11fd8f953 in datapath d01af9a8-5c81-4331-a414-f923e336df0b unbound from our chassis
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.117 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d01af9a8-5c81-4331-a414-f923e336df0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:32:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:47.118 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[89c78af8-2849-4a38-a0c1-36420477e0ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.388 182729 DEBUG nova.compute.manager [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-unplugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.389 182729 DEBUG oslo_concurrency.lockutils [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.389 182729 DEBUG oslo_concurrency.lockutils [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.389 182729 DEBUG oslo_concurrency.lockutils [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.389 182729 DEBUG nova.compute.manager [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] No waiting events found dispatching network-vif-unplugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:47 compute-0 nova_compute[182725]: 2026-01-22 22:32:47.390 182729 DEBUG nova.compute.manager [req-6c559f17-1829-4b68-a28d-9afa34ee207c req-5406428e-fc88-4048-9a4b-abbd648c9839 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-unplugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.026 182729 DEBUG nova.network.neutron [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.071 182729 INFO nova.compute.manager [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Took 0.99 seconds to deallocate network for instance.
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.191 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.192 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.262 182729 DEBUG nova.compute.provider_tree [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.278 182729 DEBUG nova.scheduler.client.report [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.299 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.322 182729 INFO nova.scheduler.client.report [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Deleted allocations for instance 0ab183cd-e3db-41c8-b78e-ce285370129b
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.403 182729 DEBUG oslo_concurrency.lockutils [None req-7db1b922-4f6e-4e23-b543-13a98d7e28c9 018d4fb3a46e4c0893d7b27e056744ff efb3126bbf024dfe96369b57d5971e94 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.850 182729 DEBUG nova.compute.manager [req-61932e51-da39-44b1-a041-12ae79e855a0 req-61eb1a48-3585-4422-9905-4b6d4cff96af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-deleted-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:48 compute-0 nova_compute[182725]: 2026-01-22 22:32:48.967 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.482 182729 DEBUG nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.483 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.483 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.483 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.483 182729 DEBUG nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] No waiting events found dispatching network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.483 182729 WARNING nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received unexpected event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 for instance with vm_state deleted and task_state None.
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.484 182729 DEBUG nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.484 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.484 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.484 182729 DEBUG oslo_concurrency.lockutils [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "0ab183cd-e3db-41c8-b78e-ce285370129b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.484 182729 DEBUG nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] No waiting events found dispatching network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.485 182729 WARNING nova.compute.manager [req-06320a03-7122-4129-8bb9-35d7bf31a4f7 req-f49ea394-7d0b-4fd2-8901-cb27d240d2f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Received unexpected event network-vif-plugged-c6243b5a-135b-4ae9-8a30-87c11fd8f953 for instance with vm_state deleted and task_state None.
Jan 22 22:32:49 compute-0 nova_compute[182725]: 2026-01-22 22:32:49.886 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:50 compute-0 podman[224529]: 2026-01-22 22:32:50.141730511 +0000 UTC m=+0.066069875 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal)
Jan 22 22:32:50 compute-0 podman[224528]: 2026-01-22 22:32:50.173370537 +0000 UTC m=+0.105723953 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:32:51 compute-0 ovn_controller[94850]: 2026-01-22T22:32:51Z|00350|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:32:51 compute-0 nova_compute[182725]: 2026-01-22 22:32:51.626 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:52 compute-0 nova_compute[182725]: 2026-01-22 22:32:52.016 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:53 compute-0 nova_compute[182725]: 2026-01-22 22:32:53.970 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:55 compute-0 nova_compute[182725]: 2026-01-22 22:32:55.529 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:57 compute-0 nova_compute[182725]: 2026-01-22 22:32:57.017 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:58.901 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:32:58 compute-0 nova_compute[182725]: 2026-01-22 22:32:58.902 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:32:58.902 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:32:58 compute-0 nova_compute[182725]: 2026-01-22 22:32:58.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:32:59 compute-0 nova_compute[182725]: 2026-01-22 22:32:59.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:00 compute-0 podman[224577]: 2026-01-22 22:33:00.1192999 +0000 UTC m=+0.054640967 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:33:00 compute-0 podman[224578]: 2026-01-22 22:33:00.133247151 +0000 UTC m=+0.063803157 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:33:01 compute-0 nova_compute[182725]: 2026-01-22 22:33:01.029 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:01 compute-0 nova_compute[182725]: 2026-01-22 22:33:01.986 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121166.9854095, 0ab183cd-e3db-41c8-b78e-ce285370129b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:33:01 compute-0 nova_compute[182725]: 2026-01-22 22:33:01.987 182729 INFO nova.compute.manager [-] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] VM Stopped (Lifecycle Event)
Jan 22 22:33:02 compute-0 nova_compute[182725]: 2026-01-22 22:33:02.008 182729 DEBUG nova.compute.manager [None req-7ba7687b-fa47-429b-9c53-f002dbf4c85e - - - - - -] [instance: 0ab183cd-e3db-41c8-b78e-ce285370129b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:33:02 compute-0 nova_compute[182725]: 2026-01-22 22:33:02.020 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:03 compute-0 ovn_controller[94850]: 2026-01-22T22:33:03Z|00351|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:33:03 compute-0 podman[224619]: 2026-01-22 22:33:03.125948529 +0000 UTC m=+0.056903824 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:33:03 compute-0 nova_compute[182725]: 2026-01-22 22:33:03.159 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:03 compute-0 nova_compute[182725]: 2026-01-22 22:33:03.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:05 compute-0 nova_compute[182725]: 2026-01-22 22:33:05.360 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:05.904 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:06 compute-0 ovn_controller[94850]: 2026-01-22T22:33:06Z|00352|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:33:07 compute-0 nova_compute[182725]: 2026-01-22 22:33:07.023 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:07 compute-0 nova_compute[182725]: 2026-01-22 22:33:07.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:08 compute-0 nova_compute[182725]: 2026-01-22 22:33:08.978 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:12 compute-0 nova_compute[182725]: 2026-01-22 22:33:12.026 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:12 compute-0 ovn_controller[94850]: 2026-01-22T22:33:12Z|00353|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:33:12 compute-0 nova_compute[182725]: 2026-01-22 22:33:12.245 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:12.440 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:12.441 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:12.442 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:13 compute-0 nova_compute[182725]: 2026-01-22 22:33:13.981 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:14 compute-0 nova_compute[182725]: 2026-01-22 22:33:14.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:17 compute-0 nova_compute[182725]: 2026-01-22 22:33:17.029 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:17 compute-0 podman[224644]: 2026-01-22 22:33:17.133418023 +0000 UTC m=+0.070691001 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.920 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:33:18 compute-0 nova_compute[182725]: 2026-01-22 22:33:18.983 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.007 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.071 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.072 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.133 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.307 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.309 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5521MB free_disk=73.33830642700195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.309 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.310 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.445 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 454ec87b-a45c-40af-8bce-d252eea19620 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.446 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.446 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.495 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.516 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.540 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.540 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:19 compute-0 nova_compute[182725]: 2026-01-22 22:33:19.697 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:20 compute-0 nova_compute[182725]: 2026-01-22 22:33:20.229 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:20 compute-0 nova_compute[182725]: 2026-01-22 22:33:20.541 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:20 compute-0 nova_compute[182725]: 2026-01-22 22:33:20.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:20 compute-0 nova_compute[182725]: 2026-01-22 22:33:20.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:33:20 compute-0 nova_compute[182725]: 2026-01-22 22:33:20.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:33:21 compute-0 nova_compute[182725]: 2026-01-22 22:33:21.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:21 compute-0 nova_compute[182725]: 2026-01-22 22:33:21.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:21 compute-0 nova_compute[182725]: 2026-01-22 22:33:21.122 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:33:21 compute-0 nova_compute[182725]: 2026-01-22 22:33:21.122 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:33:21 compute-0 podman[224673]: 2026-01-22 22:33:21.15347558 +0000 UTC m=+0.075315558 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, distribution-scope=public)
Jan 22 22:33:21 compute-0 podman[224672]: 2026-01-22 22:33:21.198017511 +0000 UTC m=+0.129036380 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:33:22 compute-0 nova_compute[182725]: 2026-01-22 22:33:22.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:22 compute-0 ovn_controller[94850]: 2026-01-22T22:33:22Z|00354|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:33:22 compute-0 nova_compute[182725]: 2026-01-22 22:33:22.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.112 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.131 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.131 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.132 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:23 compute-0 nova_compute[182725]: 2026-01-22 22:33:23.986 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:24 compute-0 nova_compute[182725]: 2026-01-22 22:33:24.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:24 compute-0 nova_compute[182725]: 2026-01-22 22:33:24.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:24 compute-0 nova_compute[182725]: 2026-01-22 22:33:24.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:24 compute-0 nova_compute[182725]: 2026-01-22 22:33:24.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:33:25 compute-0 nova_compute[182725]: 2026-01-22 22:33:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:26 compute-0 nova_compute[182725]: 2026-01-22 22:33:26.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:27 compute-0 nova_compute[182725]: 2026-01-22 22:33:27.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:28 compute-0 nova_compute[182725]: 2026-01-22 22:33:28.992 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:28 compute-0 nova_compute[182725]: 2026-01-22 22:33:28.995 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:28 compute-0 nova_compute[182725]: 2026-01-22 22:33:28.995 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.010 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.119 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.120 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.129 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.130 182729 INFO nova.compute.claims [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.281 182729 DEBUG nova.compute.provider_tree [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.296 182729 DEBUG nova.scheduler.client.report [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.316 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.317 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.388 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.389 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.411 182729 INFO nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.434 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.548 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.550 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.550 182729 INFO nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Creating image(s)
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.551 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.551 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.552 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.567 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.620 182729 DEBUG nova.policy [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839eb51e89b14157b8da40ae1b480ef3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.631 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.632 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.632 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.652 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.713 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.714 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.753 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.755 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.756 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.817 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.818 182729 DEBUG nova.virt.disk.api [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.818 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.879 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.880 182729 DEBUG nova.virt.disk.api [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.880 182729 DEBUG nova.objects.instance [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.882 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.912 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.913 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Ensure instance console log exists: /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.913 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.914 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:29 compute-0 nova_compute[182725]: 2026-01-22 22:33:29.914 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:30 compute-0 nova_compute[182725]: 2026-01-22 22:33:30.445 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Successfully created port: 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:33:31 compute-0 podman[224736]: 2026-01-22 22:33:31.138288722 +0000 UTC m=+0.072087376 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:33:31 compute-0 podman[224735]: 2026-01-22 22:33:31.138234821 +0000 UTC m=+0.075898082 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.270 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Successfully updated port: 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.290 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.290 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.291 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.378 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.402 182729 DEBUG nova.compute.manager [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.402 182729 DEBUG nova.compute.manager [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing instance network info cache due to event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.403 182729 DEBUG oslo_concurrency.lockutils [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:31 compute-0 nova_compute[182725]: 2026-01-22 22:33:31.447 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:33:32 compute-0 nova_compute[182725]: 2026-01-22 22:33:32.038 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.063 182729 DEBUG nova.network.neutron [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.085 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.086 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance network_info: |[{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.087 182729 DEBUG oslo_concurrency.lockutils [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.087 182729 DEBUG nova.network.neutron [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.092 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Start _get_guest_xml network_info=[{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.100 182729 WARNING nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.106 182729 DEBUG nova.virt.libvirt.host [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.107 182729 DEBUG nova.virt.libvirt.host [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.116 182729 DEBUG nova.virt.libvirt.host [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.117 182729 DEBUG nova.virt.libvirt.host [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.118 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.118 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.118 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.119 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.119 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.119 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.119 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.119 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.120 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.120 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.120 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.120 182729 DEBUG nova.virt.hardware [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.124 182729 DEBUG nova.virt.libvirt.vif [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1025419542',display_name='tempest-TestNetworkAdvancedServerOps-server-1025419542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1025419542',id=101,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEus4xpeVU6UA/n/yQy1kMAIvd44vHu8JBv/r1ml4Jz/KGNF+fk3mgNhSfFiD7I2hzAvoBsUSNaBpot5THfUU39PkBt2NJWCup0GtX5C21HXa7nrqKOdEjlhp77y02K9rA==',key_name='tempest-TestNetworkAdvancedServerOps-1121920412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-rq8sd9ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:29Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.124 182729 DEBUG nova.network.os_vif_util [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.125 182729 DEBUG nova.network.os_vif_util [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.125 182729 DEBUG nova.objects.instance [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.150 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <uuid>f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75</uuid>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <name>instance-00000065</name>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1025419542</nova:name>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:33:33</nova:creationTime>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         <nova:port uuid="3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf">
Jan 22 22:33:33 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <system>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="serial">f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="uuid">f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </system>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <os>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </os>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <features>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </features>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:e9:b8:63"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <target dev="tap3337d5f2-1b"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/console.log" append="off"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <video>
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </video>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:33:33 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:33:33 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:33:33 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:33:33 compute-0 nova_compute[182725]: </domain>
Jan 22 22:33:33 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.152 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Preparing to wait for external event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.153 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.153 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.154 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.154 182729 DEBUG nova.virt.libvirt.vif [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1025419542',display_name='tempest-TestNetworkAdvancedServerOps-server-1025419542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1025419542',id=101,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEus4xpeVU6UA/n/yQy1kMAIvd44vHu8JBv/r1ml4Jz/KGNF+fk3mgNhSfFiD7I2hzAvoBsUSNaBpot5THfUU39PkBt2NJWCup0GtX5C21HXa7nrqKOdEjlhp77y02K9rA==',key_name='tempest-TestNetworkAdvancedServerOps-1121920412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-rq8sd9ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:29Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.155 182729 DEBUG nova.network.os_vif_util [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.156 182729 DEBUG nova.network.os_vif_util [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.156 182729 DEBUG os_vif [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.157 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.158 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.162 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3337d5f2-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.163 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3337d5f2-1b, col_values=(('external_ids', {'iface-id': '3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:b8:63', 'vm-uuid': 'f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.164 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.1660] manager: (tap3337d5f2-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.173 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.175 182729 INFO os_vif [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b')
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.226 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.227 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.227 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:e9:b8:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.227 182729 INFO nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Using config drive
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.595 182729 INFO nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Creating config drive at /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.600 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc77blm9b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.733 182729 DEBUG oslo_concurrency.processutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc77blm9b" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:33 compute-0 kernel: tap3337d5f2-1b: entered promiscuous mode
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.8213] manager: (tap3337d5f2-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 ovn_controller[94850]: 2026-01-22T22:33:33Z|00355|binding|INFO|Claiming lport 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for this chassis.
Jan 22 22:33:33 compute-0 ovn_controller[94850]: 2026-01-22T22:33:33Z|00356|binding|INFO|3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf: Claiming fa:16:3e:e9:b8:63 10.100.0.11
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.831 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:b8:63 10.100.0.11'], port_security=['fa:16:3e:e9:b8:63 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41d14a29-e317-41c8-873b-57cd74687162', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a562ab8-d881-4671-b45c-5e728b84d5af, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.833 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf in datapath ab1e30f5-371b-4049-9721-63ec7c0c03c3 bound to our chassis
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.835 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab1e30f5-371b-4049-9721-63ec7c0c03c3
Jan 22 22:33:33 compute-0 ovn_controller[94850]: 2026-01-22T22:33:33Z|00357|binding|INFO|Setting lport 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf ovn-installed in OVS
Jan 22 22:33:33 compute-0 ovn_controller[94850]: 2026-01-22T22:33:33Z|00358|binding|INFO|Setting lport 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf up in Southbound
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.842 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.844 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.854 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3689a97-d5d8-49b5-9b2c-1912e2a9f6a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 systemd-udevd[224802]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.859 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapab1e30f5-31 in ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.862 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapab1e30f5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.863 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c34e6f34-0256-4f13-96cc-86c9e92aa8af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.864 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1a01b35f-7d47-4a64-bc7e-fbeac12be7cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 systemd-machined[154006]: New machine qemu-42-instance-00000065.
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.8824] device (tap3337d5f2-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.880 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[55531151-a0c9-4e4b-b818-ba36d4b402d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.8834] device (tap3337d5f2-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.896 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d10afeb2-84d7-42c5-a2cc-fdf119e0edc6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-00000065.
Jan 22 22:33:33 compute-0 podman[224786]: 2026-01-22 22:33:33.911328269 +0000 UTC m=+0.093118986 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.933 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b025e477-308b-45fb-997f-53742ba6880d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.940 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5712e19a-df16-4991-a8d4-a30fd0bc40e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 systemd-udevd[224814]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.9433] manager: (tapab1e30f5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.975 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7739db6e-8a9f-441f-a3b9-76686eb97502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:33.978 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7845adcb-8851-4484-804e-510999e1b65b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:33 compute-0 nova_compute[182725]: 2026-01-22 22:33:33.991 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:33 compute-0 NetworkManager[54954]: <info>  [1769121213.9991] device (tapab1e30f5-30): carrier: link connected
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.006 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3a23c2-e731-4c3b-88b1-aa6866b26b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.025 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d8e33b-5bc8-4d59-acc3-07e66ec95770]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab1e30f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:ca:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485160, 'reachable_time': 17680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224851, 'error': None, 'target': 'ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.043 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c571dc19-cda9-435e-8ae7-8c9fb3c51652]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fece:ca79'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485160, 'tstamp': 485160}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224852, 'error': None, 'target': 'ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.062 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b7639bd4-bcdf-4f6a-af89-dab40037cb8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab1e30f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:ca:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485160, 'reachable_time': 17680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224853, 'error': None, 'target': 'ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.101 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[92105fbc-0030-4d99-8ce5-60d0975b439a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.181 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2941318-bf00-4946-99a6-0cf77db37706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab1e30f5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.184 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.185 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab1e30f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.188 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:34 compute-0 kernel: tapab1e30f5-30: entered promiscuous mode
Jan 22 22:33:34 compute-0 NetworkManager[54954]: <info>  [1769121214.1885] manager: (tapab1e30f5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.191 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab1e30f5-30, col_values=(('external_ids', {'iface-id': 'bd5d6f7a-cf06-4349-aba6-14eec937270d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:34 compute-0 ovn_controller[94850]: 2026-01-22T22:33:34Z|00359|binding|INFO|Releasing lport bd5d6f7a-cf06-4349-aba6-14eec937270d from this chassis (sb_readonly=0)
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.216 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.217 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ab1e30f5-371b-4049-9721-63ec7c0c03c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ab1e30f5-371b-4049-9721-63ec7c0c03c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.219 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[522fa23d-8c92-43d4-9190-97ea5afbe689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.220 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-ab1e30f5-371b-4049-9721-63ec7c0c03c3
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/ab1e30f5-371b-4049-9721-63ec7c0c03c3.pid.haproxy
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID ab1e30f5-371b-4049-9721-63ec7c0c03c3
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:33:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:34.221 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'env', 'PROCESS_TAG=haproxy-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ab1e30f5-371b-4049-9721-63ec7c0c03c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.468 182729 DEBUG nova.network.neutron [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updated VIF entry in instance network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.469 182729 DEBUG nova.network.neutron [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.483 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121214.482156, f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.483 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] VM Started (Lifecycle Event)
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.489 182729 DEBUG oslo_concurrency.lockutils [req-937dab14-aeee-4961-a9f8-81f639f30019 req-899dbcb0-004c-4676-b14b-b544c7412433 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.502 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.506 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121214.4823518, f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.506 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] VM Paused (Lifecycle Event)
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.526 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.530 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:33:34 compute-0 nova_compute[182725]: 2026-01-22 22:33:34.550 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:33:34 compute-0 podman[224889]: 2026-01-22 22:33:34.616343462 +0000 UTC m=+0.045656781 container create 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:33:34 compute-0 systemd[1]: Started libpod-conmon-5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6.scope.
Jan 22 22:33:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:33:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6975bf9778786734282f6e52ec152e755d5f0381692b973f056ca510f692e011/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:33:34 compute-0 podman[224889]: 2026-01-22 22:33:34.592251015 +0000 UTC m=+0.021564354 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:33:34 compute-0 podman[224889]: 2026-01-22 22:33:34.69251863 +0000 UTC m=+0.121831969 container init 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:33:34 compute-0 podman[224889]: 2026-01-22 22:33:34.697581307 +0000 UTC m=+0.126894626 container start 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:33:34 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [NOTICE]   (224909) : New worker (224911) forked
Jan 22 22:33:34 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [NOTICE]   (224909) : Loading success.
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.770 182729 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.772 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.772 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.773 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.773 182729 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Processing event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.774 182729 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.774 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.775 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.775 182729 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.776 182729 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] No waiting events found dispatching network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.776 182729 WARNING nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received unexpected event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for instance with vm_state building and task_state spawning.
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.777 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.782 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121215.7824495, f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.783 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] VM Resumed (Lifecycle Event)
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.786 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.793 182729 INFO nova.virt.libvirt.driver [-] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance spawned successfully.
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.794 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.804 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.810 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.824 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.825 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.826 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.827 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.828 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.829 182729 DEBUG nova.virt.libvirt.driver [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.838 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.913 182729 INFO nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Took 6.36 seconds to spawn the instance on the hypervisor.
Jan 22 22:33:35 compute-0 nova_compute[182725]: 2026-01-22 22:33:35.914 182729 DEBUG nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:33:36 compute-0 nova_compute[182725]: 2026-01-22 22:33:36.018 182729 INFO nova.compute.manager [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Took 6.95 seconds to build instance.
Jan 22 22:33:36 compute-0 nova_compute[182725]: 2026-01-22 22:33:36.037 182729 DEBUG oslo_concurrency.lockutils [None req-0366512d-6400-49c0-8c57-ed107121022f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:36 compute-0 nova_compute[182725]: 2026-01-22 22:33:36.465 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:38 compute-0 nova_compute[182725]: 2026-01-22 22:33:38.167 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:38 compute-0 nova_compute[182725]: 2026-01-22 22:33:38.997 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.110 182729 DEBUG nova.compute.manager [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.111 182729 DEBUG nova.compute.manager [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing instance network info cache due to event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.111 182729 DEBUG oslo_concurrency.lockutils [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.111 182729 DEBUG oslo_concurrency.lockutils [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.112 182729 DEBUG nova.network.neutron [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.171 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:43 compute-0 nova_compute[182725]: 2026-01-22 22:33:43.998 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:45 compute-0 nova_compute[182725]: 2026-01-22 22:33:45.235 182729 DEBUG nova.network.neutron [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updated VIF entry in instance network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:33:45 compute-0 nova_compute[182725]: 2026-01-22 22:33:45.235 182729 DEBUG nova.network.neutron [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:33:45 compute-0 nova_compute[182725]: 2026-01-22 22:33:45.255 182729 DEBUG oslo_concurrency.lockutils [req-2a100843-08bd-4a02-a7d2-6ee82739e691 req-df2da01a-a016-4241-9545-7a46b54dce2e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:33:46 compute-0 ovn_controller[94850]: 2026-01-22T22:33:46Z|00360|binding|INFO|Releasing lport bd5d6f7a-cf06-4349-aba6-14eec937270d from this chassis (sb_readonly=0)
Jan 22 22:33:46 compute-0 ovn_controller[94850]: 2026-01-22T22:33:46Z|00361|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:33:46 compute-0 nova_compute[182725]: 2026-01-22 22:33:46.776 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:47 compute-0 ovn_controller[94850]: 2026-01-22T22:33:47Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:b8:63 10.100.0.11
Jan 22 22:33:47 compute-0 ovn_controller[94850]: 2026-01-22T22:33:47Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:b8:63 10.100.0.11
Jan 22 22:33:47 compute-0 podman[224936]: 2026-01-22 22:33:47.568892371 +0000 UTC m=+0.068904326 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:33:48 compute-0 nova_compute[182725]: 2026-01-22 22:33:48.174 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:49 compute-0 nova_compute[182725]: 2026-01-22 22:33:49.000 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:52 compute-0 podman[224959]: 2026-01-22 22:33:52.154450058 +0000 UTC m=+0.086919200 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 22 22:33:52 compute-0 podman[224960]: 2026-01-22 22:33:52.175712313 +0000 UTC m=+0.096376427 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 22 22:33:52 compute-0 nova_compute[182725]: 2026-01-22 22:33:52.455 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:52 compute-0 nova_compute[182725]: 2026-01-22 22:33:52.456 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:52 compute-0 nova_compute[182725]: 2026-01-22 22:33:52.457 182729 DEBUG nova.network.neutron [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:33:52 compute-0 nova_compute[182725]: 2026-01-22 22:33:52.462 182729 INFO nova.compute.manager [None req-7364f93f-5b1d-4614-989c-50bd5e0b7b52 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Get console output
Jan 22 22:33:52 compute-0 nova_compute[182725]: 2026-01-22 22:33:52.467 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:33:53 compute-0 nova_compute[182725]: 2026-01-22 22:33:53.179 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.003 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.666 182729 DEBUG nova.network.neutron [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.685 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.863 182729 DEBUG nova.virt.libvirt.driver [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.864 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Creating file /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/da3c46f3b58343c4b2b872d2f69406d9.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.864 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/da3c46f3b58343c4b2b872d2f69406d9.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.961 182729 INFO nova.compute.manager [None req-1398e630-4d0a-4aac-9473-7d3d9fb055fa 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Get console output
Jan 22 22:33:54 compute-0 nova_compute[182725]: 2026-01-22 22:33:54.966 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.350 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/da3c46f3b58343c4b2b872d2f69406d9.tmp" returned: 1 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.350 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/da3c46f3b58343c4b2b872d2f69406d9.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.351 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Creating directory /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.352 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.586 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.591 182729 DEBUG nova.virt.libvirt.driver [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:33:55 compute-0 nova_compute[182725]: 2026-01-22 22:33:55.984 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:57 compute-0 kernel: tapb56a4401-4c (unregistering): left promiscuous mode
Jan 22 22:33:57 compute-0 NetworkManager[54954]: <info>  [1769121237.7767] device (tapb56a4401-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:33:57 compute-0 nova_compute[182725]: 2026-01-22 22:33:57.787 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:57 compute-0 ovn_controller[94850]: 2026-01-22T22:33:57Z|00362|binding|INFO|Releasing lport b56a4401-4c89-482a-a347-ca080a879f8f from this chassis (sb_readonly=0)
Jan 22 22:33:57 compute-0 ovn_controller[94850]: 2026-01-22T22:33:57Z|00363|binding|INFO|Setting lport b56a4401-4c89-482a-a347-ca080a879f8f down in Southbound
Jan 22 22:33:57 compute-0 ovn_controller[94850]: 2026-01-22T22:33:57Z|00364|binding|INFO|Removing iface tapb56a4401-4c ovn-installed in OVS
Jan 22 22:33:57 compute-0 nova_compute[182725]: 2026-01-22 22:33:57.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:57.800 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:b5:89 10.100.0.9'], port_security=['fa:16:3e:f0:b5:89 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '454ec87b-a45c-40af-8bce-d252eea19620', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b56a4401-4c89-482a-a347-ca080a879f8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:33:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:57.802 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b56a4401-4c89-482a-a347-ca080a879f8f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:33:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:57.804 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:33:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:57.805 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[394c44b1-16fd-4873-ae51-e1e633e92e08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:57.806 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:33:57 compute-0 nova_compute[182725]: 2026-01-22 22:33:57.810 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:57 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 22 22:33:57 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000057.scope: Consumed 16.356s CPU time.
Jan 22 22:33:57 compute-0 systemd-machined[154006]: Machine qemu-40-instance-00000057 terminated.
Jan 22 22:33:57 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [NOTICE]   (224273) : haproxy version is 2.8.14-c23fe91
Jan 22 22:33:57 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [NOTICE]   (224273) : path to executable is /usr/sbin/haproxy
Jan 22 22:33:57 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [WARNING]  (224273) : Exiting Master process...
Jan 22 22:33:57 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [ALERT]    (224273) : Current worker (224275) exited with code 143 (Terminated)
Jan 22 22:33:57 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[224269]: [WARNING]  (224273) : All workers exited. Exiting... (0)
Jan 22 22:33:57 compute-0 systemd[1]: libpod-ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b.scope: Deactivated successfully.
Jan 22 22:33:57 compute-0 podman[225029]: 2026-01-22 22:33:57.947658983 +0000 UTC m=+0.043048925 container died ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:33:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b-userdata-shm.mount: Deactivated successfully.
Jan 22 22:33:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-e080680cc5f28728dc0d89d678f10cd8eac5e11c27d0168116f25ddc5eddb875-merged.mount: Deactivated successfully.
Jan 22 22:33:57 compute-0 podman[225029]: 2026-01-22 22:33:57.98882999 +0000 UTC m=+0.084219932 container cleanup ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:33:57 compute-0 systemd[1]: libpod-conmon-ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b.scope: Deactivated successfully.
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.016 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.021 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 podman[225058]: 2026-01-22 22:33:58.05794115 +0000 UTC m=+0.046109232 container remove ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.064 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c14aab05-6295-4018-822b-1c941c0c6c0d]: (4, ('Thu Jan 22 10:33:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b)\nae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b\nThu Jan 22 10:33:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b)\nae36c524d4c8fba7eb991397e9a39cbc4ab6d52805f034c16a72f66fa7a7328b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.066 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[96ed53dd-8c63-48d8-93aa-b98752146205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.067 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:58 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.089 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.092 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6b781f77-9776-42bf-9fed-4858fa0b1191]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.104 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c01bdd0a-4cb9-45fa-84da-2ffcd9d36118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.105 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8d08b1-0c2a-4ba8-a4d7-9a1c2c005ac7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.122 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5c1b0b-2815-4280-a84d-976d60dd120f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479100, 'reachable_time': 44255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225089, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.124 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:33:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:33:58.124 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[8966baf2-4306-4d08-a92b-996e648a9c0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:33:58 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.181 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.611 182729 INFO nova.virt.libvirt.driver [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance shutdown successfully after 3 seconds.
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.616 182729 INFO nova.virt.libvirt.driver [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Instance destroyed successfully.
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.617 182729 DEBUG nova.virt.libvirt.vif [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:33:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:f0:b5:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.617 182729 DEBUG nova.network.os_vif_util [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:f0:b5:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.618 182729 DEBUG nova.network.os_vif_util [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.618 182729 DEBUG os_vif [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.621 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb56a4401-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.622 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.624 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.626 182729 INFO os_vif [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.630 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.691 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.693 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.771 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.772 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Copying file /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk to 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.773 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.847 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.847 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:33:58 compute-0 nova_compute[182725]: 2026-01-22 22:33:58.848 182729 DEBUG nova.network.neutron [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.006 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.097 182729 DEBUG nova.compute.manager [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.097 182729 DEBUG oslo_concurrency.lockutils [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.098 182729 DEBUG oslo_concurrency.lockutils [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.099 182729 DEBUG oslo_concurrency.lockutils [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.100 182729 DEBUG nova.compute.manager [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.100 182729 WARNING nova.compute.manager [req-78a8a598-b85b-400e-8c5b-5b041936a2c0 req-a9953ee7-cecf-4476-9bab-f8b07d4a480c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-unplugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state resize_migrating.
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.201 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.345 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "scp -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk" returned: 0 in 0.572s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.346 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Copying file /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.346 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.config 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.632 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "scp -C -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.config 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.config" returned: 0 in 0.285s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.634 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Copying file /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.634 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.info 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:33:59 compute-0 nova_compute[182725]: 2026-01-22 22:33:59.890 182729 DEBUG oslo_concurrency.processutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "scp -C -r /var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620_resize/disk.info 192.168.122.101:/var/lib/nova/instances/454ec87b-a45c-40af-8bce-d252eea19620/disk.info" returned: 0 in 0.255s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.183 182729 DEBUG nova.compute.manager [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.184 182729 DEBUG oslo_concurrency.lockutils [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.184 182729 DEBUG oslo_concurrency.lockutils [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.184 182729 DEBUG oslo_concurrency.lockutils [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.185 182729 DEBUG nova.compute.manager [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.185 182729 WARNING nova.compute.manager [req-49ff0c6e-eb3f-4b6c-884a-d1f6245f5d3f req-f97ee187-dbb9-43b5-9221-53846988c4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state active and task_state resize_migrating.
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.580 182729 DEBUG neutronclient.v2_0.client [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b56a4401-4c89-482a-a347-ca080a879f8f for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.702 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.703 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:01 compute-0 nova_compute[182725]: 2026-01-22 22:34:01.703 182729 DEBUG oslo_concurrency.lockutils [None req-d52d1442-4e38-47f7-91d7-3d8105a0de8e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:02 compute-0 podman[225103]: 2026-01-22 22:34:02.131005542 +0000 UTC m=+0.048902733 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:34:02 compute-0 podman[225102]: 2026-01-22 22:34:02.131017412 +0000 UTC m=+0.050818871 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.624 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.728 182729 DEBUG nova.network.neutron [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.766 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.985 182729 DEBUG nova.virt.libvirt.driver [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.986 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Creating file /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/3af218dbcfeb41b18daee282a4943431.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 22:34:03 compute-0 nova_compute[182725]: 2026-01-22 22:34:03.987 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/3af218dbcfeb41b18daee282a4943431.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:04.000 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:34:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:04.002 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.012 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:04 compute-0 podman[225146]: 2026-01-22 22:34:04.137058556 +0000 UTC m=+0.076627651 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.239 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/3af218dbcfeb41b18daee282a4943431.tmp" returned: 1 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.240 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/3af218dbcfeb41b18daee282a4943431.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.240 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Creating directory /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.241 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.406 182729 DEBUG nova.compute.manager [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.407 182729 DEBUG nova.compute.manager [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing instance network info cache due to event network-changed-b56a4401-4c89-482a-a347-ca080a879f8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.407 182729 DEBUG oslo_concurrency.lockutils [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.408 182729 DEBUG oslo_concurrency.lockutils [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.408 182729 DEBUG nova.network.neutron [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Refreshing network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.474 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:04 compute-0 nova_compute[182725]: 2026-01-22 22:34:04.481 182729 DEBUG nova.virt.libvirt.driver [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.202 182729 DEBUG nova.network.neutron [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updated VIF entry in instance network info cache for port b56a4401-4c89-482a-a347-ca080a879f8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.203 182729 DEBUG nova.network.neutron [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:06 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:34:06 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.221 182729 DEBUG oslo_concurrency.lockutils [req-ee60a8b3-d1d6-4173-8c7d-d28d1a3cd7f9 req-3b0a9736-90e4-4f08-8964-49adfe1fa360 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:06 compute-0 kernel: tap3337d5f2-1b (unregistering): left promiscuous mode
Jan 22 22:34:06 compute-0 NetworkManager[54954]: <info>  [1769121246.6543] device (tap3337d5f2-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:34:06 compute-0 ovn_controller[94850]: 2026-01-22T22:34:06Z|00365|binding|INFO|Releasing lport 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf from this chassis (sb_readonly=0)
Jan 22 22:34:06 compute-0 ovn_controller[94850]: 2026-01-22T22:34:06Z|00366|binding|INFO|Setting lport 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf down in Southbound
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.655 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 ovn_controller[94850]: 2026-01-22T22:34:06Z|00367|binding|INFO|Removing iface tap3337d5f2-1b ovn-installed in OVS
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.670 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:b8:63 10.100.0.11'], port_security=['fa:16:3e:e9:b8:63 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41d14a29-e317-41c8-873b-57cd74687162', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a562ab8-d881-4671-b45c-5e728b84d5af, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.674 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf in datapath ab1e30f5-371b-4049-9721-63ec7c0c03c3 unbound from our chassis
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.674 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.677 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab1e30f5-371b-4049-9721-63ec7c0c03c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.679 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4a564136-41ad-44ad-931a-2b7941bf93f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.680 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3 namespace which is not needed anymore
Jan 22 22:34:06 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 22 22:34:06 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000065.scope: Consumed 13.481s CPU time.
Jan 22 22:34:06 compute-0 systemd-machined[154006]: Machine qemu-42-instance-00000065 terminated.
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [NOTICE]   (224909) : haproxy version is 2.8.14-c23fe91
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [NOTICE]   (224909) : path to executable is /usr/sbin/haproxy
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [WARNING]  (224909) : Exiting Master process...
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [WARNING]  (224909) : Exiting Master process...
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [ALERT]    (224909) : Current worker (224911) exited with code 143 (Terminated)
Jan 22 22:34:06 compute-0 neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3[224905]: [WARNING]  (224909) : All workers exited. Exiting... (0)
Jan 22 22:34:06 compute-0 systemd[1]: libpod-5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6.scope: Deactivated successfully.
Jan 22 22:34:06 compute-0 podman[225199]: 2026-01-22 22:34:06.824982287 +0000 UTC m=+0.049799495 container died 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:34:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6-userdata-shm.mount: Deactivated successfully.
Jan 22 22:34:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-6975bf9778786734282f6e52ec152e755d5f0381692b973f056ca510f692e011-merged.mount: Deactivated successfully.
Jan 22 22:34:06 compute-0 podman[225199]: 2026-01-22 22:34:06.856445119 +0000 UTC m=+0.081262317 container cleanup 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 22:34:06 compute-0 systemd[1]: libpod-conmon-5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6.scope: Deactivated successfully.
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.889 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.894 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 podman[225226]: 2026-01-22 22:34:06.925484018 +0000 UTC m=+0.043970768 container remove 5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.931 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2a698f77-cc97-4e40-bbe4-36eb4208866b]: (4, ('Thu Jan 22 10:34:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3 (5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6)\n5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6\nThu Jan 22 10:34:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3 (5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6)\n5e133996bb65d0238e7ba5ba3eaff4858dbe7678f854219a8f70ea12fbde64e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.933 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1acabfe4-e49f-4638-9fa6-24a4dbad9292]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.934 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab1e30f5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.936 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 kernel: tapab1e30f5-30: left promiscuous mode
Jan 22 22:34:06 compute-0 nova_compute[182725]: 2026-01-22 22:34:06.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.957 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[237b85c8-1831-4c7d-9df9-0381fa6720be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.969 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[61dc203a-537d-41d6-a0e5-7788516ea93b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.971 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1ed37c-f5ae-4107-912a-9824864fff0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.989 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d748a5-44e7-4644-86f2-44b3a781190c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485152, 'reachable_time': 23182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225259, 'error': None, 'target': 'ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:06 compute-0 systemd[1]: run-netns-ovnmeta\x2dab1e30f5\x2d371b\x2d4049\x2d9721\x2d63ec7c0c03c3.mount: Deactivated successfully.
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.994 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ab1e30f5-371b-4049-9721-63ec7c0c03c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:34:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:06.994 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[09edb10e-0ca8-45f3-aa4b-ec3bffb3eff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.502 182729 INFO nova.virt.libvirt.driver [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance shutdown successfully after 3 seconds.
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.511 182729 INFO nova.virt.libvirt.driver [-] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Instance destroyed successfully.
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.512 182729 DEBUG nova.virt.libvirt.vif [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1025419542',display_name='tempest-TestNetworkAdvancedServerOps-server-1025419542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1025419542',id=101,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEus4xpeVU6UA/n/yQy1kMAIvd44vHu8JBv/r1ml4Jz/KGNF+fk3mgNhSfFiD7I2hzAvoBsUSNaBpot5THfUU39PkBt2NJWCup0GtX5C21HXa7nrqKOdEjlhp77y02K9rA==',key_name='tempest-TestNetworkAdvancedServerOps-1121920412',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:33:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-rq8sd9ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:33:58Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1969906854", "vif_mac": "fa:16:3e:e9:b8:63"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.512 182729 DEBUG nova.network.os_vif_util [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converting VIF {"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1969906854", "vif_mac": "fa:16:3e:e9:b8:63"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.513 182729 DEBUG nova.network.os_vif_util [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.514 182729 DEBUG os_vif [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.516 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.516 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3337d5f2-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.517 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.519 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.522 182729 INFO os_vif [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b')
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.526 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.587 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.590 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.654 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.656 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Copying file /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk to 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.656 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.763 182729 DEBUG nova.compute.manager [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-unplugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.765 182729 DEBUG oslo_concurrency.lockutils [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.765 182729 DEBUG oslo_concurrency.lockutils [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.766 182729 DEBUG oslo_concurrency.lockutils [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.766 182729 DEBUG nova.compute.manager [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] No waiting events found dispatching network-vif-unplugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:07 compute-0 nova_compute[182725]: 2026-01-22 22:34:07.767 182729 WARNING nova.compute.manager [req-bb8f8ba8-7ad2-4f2e-b759-fc314ead35d9 req-159334d5-698a-4e9b-96d8-f9b41b3523bb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received unexpected event network-vif-unplugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for instance with vm_state active and task_state resize_migrating.
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.248 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "scp -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk" returned: 0 in 0.592s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.249 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Copying file /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.250 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.config 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.533 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "scp -C -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.config 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.config" returned: 0 in 0.284s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.535 182729 DEBUG nova.virt.libvirt.volume.remotefs [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Copying file /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.535 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.info 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:08 compute-0 nova_compute[182725]: 2026-01-22 22:34:08.810 182729 DEBUG oslo_concurrency.processutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "scp -C -r /var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75_resize/disk.info 192.168.122.101:/var/lib/nova/instances/f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75/disk.info" returned: 0 in 0.275s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:09 compute-0 nova_compute[182725]: 2026-01-22 22:34:09.011 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:09 compute-0 nova_compute[182725]: 2026-01-22 22:34:09.028 182729 DEBUG neutronclient.v2_0.client [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.111 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1025419542', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000065', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '839eb51e89b14157b8da40ae1b480ef3', 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'hostId': 'a22eb6b98314bb9ebe332b8461828f2ec243936a1e7cd368b4505f50', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.114 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '454ec87b-a45c-40af-8bce-d252eea19620', 'name': 'tempest-ServerActionsTestJSON-server-256562799', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000057', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '301c97a097c64afd8d55adb73fdd8cce', 'user_id': '97ae504d8c4f43529c360266766791d0', 'hostId': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.115 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.116 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.117 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.117 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.118 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.118 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.119 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.119 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.120 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.121 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.121 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.122 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.123 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.124 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.125 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.126 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.127 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.128 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.128 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.128 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.129 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.130 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.130 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.131 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.132 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.133 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.133 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.134 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.135 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.136 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.136 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.137 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.138 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>]
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.139 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.140 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.140 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.140 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>]
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.141 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.141 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.142 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.143 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.143 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.143 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.144 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>]
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.145 12 DEBUG ceilometer.compute.pollsters [-] Instance f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000065, id=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.145 12 DEBUG ceilometer.compute.pollsters [-] Instance 454ec87b-a45c-40af-8bce-d252eea19620 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000057, id=454ec87b-a45c-40af-8bce-d252eea19620>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.146 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:34:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:34:09.146 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1025419542>]
Jan 22 22:34:09 compute-0 nova_compute[182725]: 2026-01-22 22:34:09.156 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:09 compute-0 nova_compute[182725]: 2026-01-22 22:34:09.156 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:09 compute-0 nova_compute[182725]: 2026-01-22 22:34:09.157 182729 DEBUG oslo_concurrency.lockutils [None req-df3e65f9-79c6-4951-8afa-2dc7fa99cf19 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:10.004 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.003 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.004 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.004 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.005 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.005 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] No waiting events found dispatching network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.005 182729 WARNING nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received unexpected event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for instance with vm_state active and task_state resize_migrated.
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.006 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.006 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.006 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.006 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.007 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.007 182729 WARNING nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state resized and task_state None.
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.007 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.008 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.008 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.008 182729 DEBUG oslo_concurrency.lockutils [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.008 182729 DEBUG nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] No waiting events found dispatching network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.009 182729 WARNING nova.compute.manager [req-5cf856f3-3a25-4364-9cad-93d0f2f01a7a req-1c208d91-4345-47b2-81dd-e7e16a7e149a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Received unexpected event network-vif-plugged-b56a4401-4c89-482a-a347-ca080a879f8f for instance with vm_state resized and task_state None.
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.202 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "454ec87b-a45c-40af-8bce-d252eea19620" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.203 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.203 182729 DEBUG nova.compute.manager [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Going to confirm migration 14 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.248 182729 DEBUG nova.objects.instance [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'info_cache' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.886 182729 DEBUG neutronclient.v2_0.client [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b56a4401-4c89-482a-a347-ca080a879f8f for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.887 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.887 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:10 compute-0 nova_compute[182725]: 2026-01-22 22:34:10.887 182729 DEBUG nova.network.neutron [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:11 compute-0 nova_compute[182725]: 2026-01-22 22:34:11.193 182729 DEBUG nova.compute.manager [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:11 compute-0 nova_compute[182725]: 2026-01-22 22:34:11.194 182729 DEBUG nova.compute.manager [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing instance network info cache due to event network-changed-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:34:11 compute-0 nova_compute[182725]: 2026-01-22 22:34:11.194 182729 DEBUG oslo_concurrency.lockutils [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:11 compute-0 nova_compute[182725]: 2026-01-22 22:34:11.195 182729 DEBUG oslo_concurrency.lockutils [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:11 compute-0 nova_compute[182725]: 2026-01-22 22:34:11.195 182729 DEBUG nova.network.neutron [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Refreshing network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:34:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:12.442 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:12.443 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:12.443 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:12 compute-0 nova_compute[182725]: 2026-01-22 22:34:12.518 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.051 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121238.0499446, 454ec87b-a45c-40af-8bce-d252eea19620 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.052 182729 INFO nova.compute.manager [-] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] VM Stopped (Lifecycle Event)
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.073 182729 DEBUG nova.compute.manager [None req-abd60db3-79fe-4c8c-a39a-41bb4de4a6e4 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.078 182729 DEBUG nova.compute.manager [None req-abd60db3-79fe-4c8c-a39a-41bb4de4a6e4 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.095 182729 INFO nova.compute.manager [None req-abd60db3-79fe-4c8c-a39a-41bb4de4a6e4 - - - - - -] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.285 182729 DEBUG nova.network.neutron [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: 454ec87b-a45c-40af-8bce-d252eea19620] Updating instance_info_cache with network_info: [{"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.312 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-454ec87b-a45c-40af-8bce-d252eea19620" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.313 182729 DEBUG nova.objects.instance [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid 454ec87b-a45c-40af-8bce-d252eea19620 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.348 182729 DEBUG nova.virt.libvirt.vif [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-256562799',display_name='tempest-ServerActionsTestJSON-server-256562799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-256562799',id=87,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-ub4wld9l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:34:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=454ec87b-a45c-40af-8bce-d252eea19620,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.348 182729 DEBUG nova.network.os_vif_util [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "b56a4401-4c89-482a-a347-ca080a879f8f", "address": "fa:16:3e:f0:b5:89", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb56a4401-4c", "ovs_interfaceid": "b56a4401-4c89-482a-a347-ca080a879f8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.349 182729 DEBUG nova.network.os_vif_util [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.349 182729 DEBUG os_vif [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.352 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb56a4401-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.352 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.355 182729 INFO os_vif [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:b5:89,bridge_name='br-int',has_traffic_filtering=True,id=b56a4401-4c89-482a-a347-ca080a879f8f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb56a4401-4c')
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.355 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.355 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.558 182729 DEBUG nova.compute.provider_tree [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.571 182729 DEBUG nova.scheduler.client.report [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.611 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.719 182729 DEBUG nova.network.neutron [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updated VIF entry in instance network info cache for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.720 182729 DEBUG nova.network.neutron [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.757 182729 DEBUG oslo_concurrency.lockutils [req-f1a4a1e5-7c55-4acf-a115-53c26d53a98b req-49d8364c-69ae-48ca-9537-dd5027a7b33f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:13 compute-0 nova_compute[182725]: 2026-01-22 22:34:13.764 182729 INFO nova.scheduler.client.report [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Deleted allocation for migration 9d58a673-9b11-4e5e-a6f4-33b24a6148ac
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.089 182729 DEBUG oslo_concurrency.lockutils [None req-fbd64262-a4e6-40bc-94d1-49ba49e58bc7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "454ec87b-a45c-40af-8bce-d252eea19620" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.924 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.925 182729 DEBUG nova.compute.manager [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.926 182729 DEBUG oslo_concurrency.lockutils [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.926 182729 DEBUG oslo_concurrency.lockutils [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.926 182729 DEBUG oslo_concurrency.lockutils [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.927 182729 DEBUG nova.compute.manager [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] No waiting events found dispatching network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:14 compute-0 nova_compute[182725]: 2026-01-22 22:34:14.927 182729 WARNING nova.compute.manager [req-2fc3687f-47b2-4adf-aa28-722ee39f3727 req-cdbddf1e-5af6-4a7a-9e26-55a2a8d71852 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received unexpected event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for instance with vm_state active and task_state resize_finish.
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.045 182729 DEBUG nova.compute.manager [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.045 182729 DEBUG oslo_concurrency.lockutils [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.045 182729 DEBUG oslo_concurrency.lockutils [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.046 182729 DEBUG oslo_concurrency.lockutils [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.046 182729 DEBUG nova.compute.manager [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] No waiting events found dispatching network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.046 182729 WARNING nova.compute.manager [req-4086fc6c-aca9-4623-b49f-2ca0c3a98ca9 req-f1a50656-4edb-42ec-99fc-4daec71c0aff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Received unexpected event network-vif-plugged-3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for instance with vm_state resized and task_state None.
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.130 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.130 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.130 182729 DEBUG nova.compute.manager [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.177 182729 DEBUG nova.objects.instance [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'info_cache' on Instance uuid f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.521 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.980 182729 DEBUG neutronclient.v2_0.client [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.980 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.980 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:17 compute-0 nova_compute[182725]: 2026-01-22 22:34:17.981 182729 DEBUG nova.network.neutron [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:18 compute-0 podman[225272]: 2026-01-22 22:34:18.121166901 +0000 UTC m=+0.057412917 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.016 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.563 182729 DEBUG nova.network.neutron [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Updating instance_info_cache with network_info: [{"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.579 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.580 182729 DEBUG nova.objects.instance [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.607 182729 DEBUG nova.virt.libvirt.vif [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1025419542',display_name='tempest-TestNetworkAdvancedServerOps-server-1025419542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1025419542',id=101,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEus4xpeVU6UA/n/yQy1kMAIvd44vHu8JBv/r1ml4Jz/KGNF+fk3mgNhSfFiD7I2hzAvoBsUSNaBpot5THfUU39PkBt2NJWCup0GtX5C21HXa7nrqKOdEjlhp77y02K9rA==',key_name='tempest-TestNetworkAdvancedServerOps-1121920412',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-rq8sd9ww',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:34:15Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.607 182729 DEBUG nova.network.os_vif_util [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "address": "fa:16:3e:e9:b8:63", "network": {"id": "ab1e30f5-371b-4049-9721-63ec7c0c03c3", "bridge": "br-int", "label": "tempest-network-smoke--1969906854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3337d5f2-1b", "ovs_interfaceid": "3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.608 182729 DEBUG nova.network.os_vif_util [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.608 182729 DEBUG os_vif [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.609 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.610 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3337d5f2-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.610 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.612 182729 INFO os_vif [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:b8:63,bridge_name='br-int',has_traffic_filtering=True,id=3337d5f2-1b98-4d55-8cd9-4ddbf11c0bdf,network=Network(ab1e30f5-371b-4049-9721-63ec7c0c03c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3337d5f2-1b')
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.612 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.612 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.692 182729 DEBUG nova.compute.provider_tree [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.709 182729 DEBUG nova.scheduler.client.report [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.764 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.911 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.954 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.955 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.955 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:19 compute-0 nova_compute[182725]: 2026-01-22 22:34:19.955 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.001 182729 INFO nova.scheduler.client.report [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocation for migration d561ea98-e1a2-411e-bdc7-bba5a839f4ea
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.148 182729 DEBUG oslo_concurrency.lockutils [None req-c39bbabb-af15-472b-9139-5d7837e38c8f 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.151 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.152 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.36791610717773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.153 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.153 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.200 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.201 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.225 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.239 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.260 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:34:20 compute-0 nova_compute[182725]: 2026-01-22 22:34:20.261 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:21 compute-0 nova_compute[182725]: 2026-01-22 22:34:21.930 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121246.9283636, f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:21 compute-0 nova_compute[182725]: 2026-01-22 22:34:21.930 182729 INFO nova.compute.manager [-] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] VM Stopped (Lifecycle Event)
Jan 22 22:34:21 compute-0 nova_compute[182725]: 2026-01-22 22:34:21.951 182729 DEBUG nova.compute.manager [None req-dc435165-e2e5-452a-ae35-8eefa3b5bebe - - - - - -] [instance: f5751fc4-5a6d-45f6-ba5b-f552cdb7cb75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.239 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.239 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.239 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.252 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.252 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:22 compute-0 nova_compute[182725]: 2026-01-22 22:34:22.523 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:23 compute-0 podman[225295]: 2026-01-22 22:34:23.150961743 +0000 UTC m=+0.069733177 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Jan 22 22:34:23 compute-0 podman[225294]: 2026-01-22 22:34:23.190901848 +0000 UTC m=+0.117403677 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 22:34:23 compute-0 nova_compute[182725]: 2026-01-22 22:34:23.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.017 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.837 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.837 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.853 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.957 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.957 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.963 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:34:24 compute-0 nova_compute[182725]: 2026-01-22 22:34:24.963 182729 INFO nova.compute.claims [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.100 182729 DEBUG nova.compute.provider_tree [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.116 182729 DEBUG nova.scheduler.client.report [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.143 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.144 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.197 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.198 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.216 182729 INFO nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.237 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.344 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.346 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.346 182729 INFO nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Creating image(s)
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.347 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.347 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.348 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.359 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.420 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.421 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.422 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.433 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.493 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.495 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.541 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.543 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.543 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.620 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.622 182729 DEBUG nova.virt.disk.api [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.623 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.656 182729 DEBUG nova.policy [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.694 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.695 182729 DEBUG nova.virt.disk.api [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.695 182729 DEBUG nova.objects.instance [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.719 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.720 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Ensure instance console log exists: /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.720 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.721 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.721 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:34:25 compute-0 nova_compute[182725]: 2026-01-22 22:34:25.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:26 compute-0 nova_compute[182725]: 2026-01-22 22:34:26.512 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Successfully created port: afca4e40-9413-4bf5-91f1-2208d0fb0153 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.430 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Successfully updated port: afca4e40-9413-4bf5-91f1-2208d0fb0153 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.446 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.447 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.447 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.530 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.556 182729 DEBUG nova.compute.manager [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-changed-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.556 182729 DEBUG nova.compute.manager [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Refreshing instance network info cache due to event network-changed-afca4e40-9413-4bf5-91f1-2208d0fb0153. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.557 182729 DEBUG oslo_concurrency.lockutils [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.681 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.871 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.907 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:27 compute-0 nova_compute[182725]: 2026-01-22 22:34:27.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:28 compute-0 nova_compute[182725]: 2026-01-22 22:34:28.051 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.019 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.799 182729 DEBUG nova.network.neutron [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.840 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.841 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance network_info: |[{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.841 182729 DEBUG oslo_concurrency.lockutils [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.841 182729 DEBUG nova.network.neutron [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Refreshing network info cache for port afca4e40-9413-4bf5-91f1-2208d0fb0153 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.844 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start _get_guest_xml network_info=[{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.850 182729 WARNING nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.855 182729 DEBUG nova.virt.libvirt.host [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.856 182729 DEBUG nova.virt.libvirt.host [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.865 182729 DEBUG nova.virt.libvirt.host [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.866 182729 DEBUG nova.virt.libvirt.host [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.867 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.867 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.868 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.868 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.868 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.868 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.869 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.869 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.869 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.869 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.869 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.870 182729 DEBUG nova.virt.hardware [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.873 182729 DEBUG nova.virt.libvirt.vif [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.873 182729 DEBUG nova.network.os_vif_util [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.874 182729 DEBUG nova.network.os_vif_util [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.874 182729 DEBUG nova.objects.instance [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.891 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <uuid>def087b6-47f2-4914-8d90-1e4426e4da0a</uuid>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <name>instance-00000069</name>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-386387448</nova:name>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:34:29</nova:creationTime>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         <nova:port uuid="afca4e40-9413-4bf5-91f1-2208d0fb0153">
Jan 22 22:34:29 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <system>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="serial">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="uuid">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </system>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <os>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </os>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <features>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </features>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:fa:b4:6a"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <target dev="tapafca4e40-94"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/console.log" append="off"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <video>
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </video>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:34:29 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:34:29 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:34:29 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:34:29 compute-0 nova_compute[182725]: </domain>
Jan 22 22:34:29 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.892 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Preparing to wait for external event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.892 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.893 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.893 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.893 182729 DEBUG nova.virt.libvirt.vif [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.894 182729 DEBUG nova.network.os_vif_util [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.894 182729 DEBUG nova.network.os_vif_util [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.894 182729 DEBUG os_vif [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.895 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.895 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.896 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.899 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.899 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafca4e40-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.900 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafca4e40-94, col_values=(('external_ids', {'iface-id': 'afca4e40-9413-4bf5-91f1-2208d0fb0153', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:b4:6a', 'vm-uuid': 'def087b6-47f2-4914-8d90-1e4426e4da0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.901 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 NetworkManager[54954]: <info>  [1769121269.9029] manager: (tapafca4e40-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.904 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.908 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:29 compute-0 nova_compute[182725]: 2026-01-22 22:34:29.909 182729 INFO os_vif [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.192 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.194 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.194 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No VIF found with MAC fa:16:3e:fa:b4:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.195 182729 INFO nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Using config drive
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.767 182729 INFO nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Creating config drive at /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.775 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5g_9q5c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:30 compute-0 nova_compute[182725]: 2026-01-22 22:34:30.929 182729 DEBUG oslo_concurrency.processutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5g_9q5c" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:31 compute-0 kernel: tapafca4e40-94: entered promiscuous mode
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.0220] manager: (tapafca4e40-94): new Tun device (/org/freedesktop/NetworkManager/Devices/174)
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.021 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 ovn_controller[94850]: 2026-01-22T22:34:31Z|00368|binding|INFO|Claiming lport afca4e40-9413-4bf5-91f1-2208d0fb0153 for this chassis.
Jan 22 22:34:31 compute-0 ovn_controller[94850]: 2026-01-22T22:34:31Z|00369|binding|INFO|afca4e40-9413-4bf5-91f1-2208d0fb0153: Claiming fa:16:3e:fa:b4:6a 10.100.0.14
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.024 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.036 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.037 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.038 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:34:31 compute-0 systemd-udevd[225374]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.056 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c9bbea29-db3f-4621-938f-026607a44225]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.057 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.059 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.059 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd1f88f-bcab-4e71-8189-f19487d38104]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.061 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[10d9800d-a382-495f-9d56-17341109a28a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 systemd-machined[154006]: New machine qemu-43-instance-00000069.
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.0700] device (tapafca4e40-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.0710] device (tapafca4e40-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.074 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[7f743bac-1cd0-485f-a30c-762666f77998]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.081 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 ovn_controller[94850]: 2026-01-22T22:34:31Z|00370|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 ovn-installed in OVS
Jan 22 22:34:31 compute-0 ovn_controller[94850]: 2026-01-22T22:34:31Z|00371|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 up in Southbound
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.089 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000069.
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.101 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b87f1860-e573-4dd9-9620-783cda1f4f66]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.131 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0195f5aa-c61a-4833-83b2-21445ad055ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.137 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c48b7921-df17-4044-83c5-f6fe81f5d52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.1379] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/175)
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.176 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7f456afb-4d36-44ed-9080-11420a9ee96f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.180 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b35738b3-dbae-4cc6-862c-626452833930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.2097] device (tape65877e5-00): carrier: link connected
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.220 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6f626f23-7027-4c04-89b0-9457aa6692fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.245 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c6133131-ce84-41fe-b7d1-0b29df04bdc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490881, 'reachable_time': 26738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225408, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.266 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94189244-7b64-4905-b911-5ba4205fcd40]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490881, 'tstamp': 490881}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225409, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.296 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd1ea83-6fd3-405b-b743-18dad4b81c81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490881, 'reachable_time': 26738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225410, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.341 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[111a2ba9-07f0-43b8-a3bb-9b1460c592e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.418 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8b436efd-561b-40c6-83f2-0a3d279cfc8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.420 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.421 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.422 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.424 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:34:31 compute-0 NetworkManager[54954]: <info>  [1769121271.4256] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.433 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.434 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 ovn_controller[94850]: 2026-01-22T22:34:31Z|00372|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.439 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.440 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[be349634-5580-40dc-8519-9c777eb4015c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.441 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:34:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:31.443 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.449 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.852 182729 DEBUG nova.compute.manager [req-496b6e90-86d3-480b-8f2e-e04d706d451a req-2270e90b-2026-4d3f-a329-66f7e21224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.854 182729 DEBUG oslo_concurrency.lockutils [req-496b6e90-86d3-480b-8f2e-e04d706d451a req-2270e90b-2026-4d3f-a329-66f7e21224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.855 182729 DEBUG oslo_concurrency.lockutils [req-496b6e90-86d3-480b-8f2e-e04d706d451a req-2270e90b-2026-4d3f-a329-66f7e21224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.856 182729 DEBUG oslo_concurrency.lockutils [req-496b6e90-86d3-480b-8f2e-e04d706d451a req-2270e90b-2026-4d3f-a329-66f7e21224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.857 182729 DEBUG nova.compute.manager [req-496b6e90-86d3-480b-8f2e-e04d706d451a req-2270e90b-2026-4d3f-a329-66f7e21224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Processing event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:34:31 compute-0 podman[225442]: 2026-01-22 22:34:31.879563522 +0000 UTC m=+0.061627622 container create 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:34:31 compute-0 systemd[1]: Started libpod-conmon-1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf.scope.
Jan 22 22:34:31 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:34:31 compute-0 podman[225442]: 2026-01-22 22:34:31.852262585 +0000 UTC m=+0.034326735 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:34:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0df098f2126e9d0dba755a39d75fd5879c5b55065ef7ccb005c16447ca724e2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.956 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.958 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121271.9570677, def087b6-47f2-4914-8d90-1e4426e4da0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.958 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Started (Lifecycle Event)
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.964 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:34:31 compute-0 podman[225442]: 2026-01-22 22:34:31.965282421 +0000 UTC m=+0.147346521 container init 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.969 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance spawned successfully.
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.970 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:34:31 compute-0 podman[225442]: 2026-01-22 22:34:31.975502328 +0000 UTC m=+0.157566428 container start 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:34:31 compute-0 nova_compute[182725]: 2026-01-22 22:34:31.999 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.005 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.009 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.009 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.010 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.010 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [NOTICE]   (225468) : New worker (225470) forked
Jan 22 22:34:32 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [NOTICE]   (225468) : Loading success.
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.012 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.012 182729 DEBUG nova.virt.libvirt.driver [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.025 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.026 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121271.9572196, def087b6-47f2-4914-8d90-1e4426e4da0a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.026 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Paused (Lifecycle Event)
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.045 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.048 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121271.9616084, def087b6-47f2-4914-8d90-1e4426e4da0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.048 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Resumed (Lifecycle Event)
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.072 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.075 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.108 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.117 182729 INFO nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Took 6.77 seconds to spawn the instance on the hypervisor.
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.118 182729 DEBUG nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.197 182729 DEBUG nova.network.neutron [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updated VIF entry in instance network info cache for port afca4e40-9413-4bf5-91f1-2208d0fb0153. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.198 182729 DEBUG nova.network.neutron [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.213 182729 DEBUG oslo_concurrency.lockutils [req-f3dec88a-c8eb-4bec-bd48-625a1f32dc1b req-7103157c-695b-4e4a-a9ff-b3fddda989cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.224 182729 INFO nova.compute.manager [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Took 7.31 seconds to build instance.
Jan 22 22:34:32 compute-0 nova_compute[182725]: 2026-01-22 22:34:32.245 182729 DEBUG oslo_concurrency.lockutils [None req-03507a88-fca2-4fb4-b1af-011ae34b29b9 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:33 compute-0 podman[225480]: 2026-01-22 22:34:33.125629538 +0000 UTC m=+0.056960365 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:34:33 compute-0 podman[225479]: 2026-01-22 22:34:33.13800048 +0000 UTC m=+0.069993034 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.018 182729 DEBUG nova.compute.manager [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.019 182729 DEBUG oslo_concurrency.lockutils [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.019 182729 DEBUG oslo_concurrency.lockutils [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.019 182729 DEBUG oslo_concurrency.lockutils [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.019 182729 DEBUG nova.compute.manager [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.019 182729 WARNING nova.compute.manager [req-cae5a7e4-d95b-489d-bb69-96f7833c2d1d req-45094f7f-b59f-4a98-a689-8163f95cd785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state active and task_state None.
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.021 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:34 compute-0 nova_compute[182725]: 2026-01-22 22:34:34.903 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:35 compute-0 podman[225522]: 2026-01-22 22:34:35.166223022 +0000 UTC m=+0.083762110 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:34:35 compute-0 NetworkManager[54954]: <info>  [1769121275.4815] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 22 22:34:35 compute-0 NetworkManager[54954]: <info>  [1769121275.4824] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 22 22:34:35 compute-0 nova_compute[182725]: 2026-01-22 22:34:35.481 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:35 compute-0 nova_compute[182725]: 2026-01-22 22:34:35.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:35 compute-0 ovn_controller[94850]: 2026-01-22T22:34:35Z|00373|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:34:35 compute-0 nova_compute[182725]: 2026-01-22 22:34:35.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:36 compute-0 nova_compute[182725]: 2026-01-22 22:34:36.180 182729 DEBUG nova.compute.manager [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-changed-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:36 compute-0 nova_compute[182725]: 2026-01-22 22:34:36.180 182729 DEBUG nova.compute.manager [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Refreshing instance network info cache due to event network-changed-afca4e40-9413-4bf5-91f1-2208d0fb0153. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:34:36 compute-0 nova_compute[182725]: 2026-01-22 22:34:36.180 182729 DEBUG oslo_concurrency.lockutils [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:36 compute-0 nova_compute[182725]: 2026-01-22 22:34:36.180 182729 DEBUG oslo_concurrency.lockutils [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:36 compute-0 nova_compute[182725]: 2026-01-22 22:34:36.181 182729 DEBUG nova.network.neutron [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Refreshing network info cache for port afca4e40-9413-4bf5-91f1-2208d0fb0153 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:34:37 compute-0 nova_compute[182725]: 2026-01-22 22:34:37.974 182729 DEBUG nova.network.neutron [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updated VIF entry in instance network info cache for port afca4e40-9413-4bf5-91f1-2208d0fb0153. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:34:37 compute-0 nova_compute[182725]: 2026-01-22 22:34:37.976 182729 DEBUG nova.network.neutron [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:38 compute-0 nova_compute[182725]: 2026-01-22 22:34:38.005 182729 DEBUG oslo_concurrency.lockutils [req-ffb4210d-c4a9-4686-9a11-bb22540d8bf0 req-d58380a5-534c-46fc-8589-d66e4af7c330 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.025 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.670 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.671 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.699 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:34:39 compute-0 ovn_controller[94850]: 2026-01-22T22:34:39Z|00374|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.852 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.853 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.862 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.862 182729 INFO nova.compute.claims [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.897 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:39 compute-0 nova_compute[182725]: 2026-01-22 22:34:39.906 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.020 182729 DEBUG nova.compute.provider_tree [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.034 182729 DEBUG nova.scheduler.client.report [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.056 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.057 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.118 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.119 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.137 182729 INFO nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.159 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.284 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.285 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.286 182729 INFO nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Creating image(s)
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.286 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.287 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.287 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.299 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.326 182729 DEBUG nova.policy [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.368 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.369 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.370 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.380 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.461 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.463 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.509 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.511 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.512 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.587 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.588 182729 DEBUG nova.virt.disk.api [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Checking if we can resize image /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.589 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.687 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.689 182729 DEBUG nova.virt.disk.api [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Cannot resize image /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.690 182729 DEBUG nova.objects.instance [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'migration_context' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.743 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.745 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Ensure instance console log exists: /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.746 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.747 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:40 compute-0 nova_compute[182725]: 2026-01-22 22:34:40.747 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:41 compute-0 nova_compute[182725]: 2026-01-22 22:34:41.174 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Successfully created port: 03c52ba6-920a-4801-bb3e-c99e3203ab13 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.844 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Successfully updated port: 03c52ba6-920a-4801-bb3e-c99e3203ab13 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.861 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.862 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.862 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.912 182729 DEBUG nova.compute.manager [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-changed-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.912 182729 DEBUG nova.compute.manager [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Refreshing instance network info cache due to event network-changed-03c52ba6-920a-4801-bb3e-c99e3203ab13. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:34:42 compute-0 nova_compute[182725]: 2026-01-22 22:34:42.912 182729 DEBUG oslo_concurrency.lockutils [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.010 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.973 182729 DEBUG nova.network.neutron [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updating instance_info_cache with network_info: [{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.991 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.991 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance network_info: |[{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.992 182729 DEBUG oslo_concurrency.lockutils [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.992 182729 DEBUG nova.network.neutron [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Refreshing network info cache for port 03c52ba6-920a-4801-bb3e-c99e3203ab13 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:34:43 compute-0 nova_compute[182725]: 2026-01-22 22:34:43.995 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start _get_guest_xml network_info=[{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.001 182729 WARNING nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.008 182729 DEBUG nova.virt.libvirt.host [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.009 182729 DEBUG nova.virt.libvirt.host [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.011 182729 DEBUG nova.virt.libvirt.host [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.012 182729 DEBUG nova.virt.libvirt.host [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.014 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.014 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.015 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.015 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.015 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.015 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.016 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.016 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.016 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.016 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.017 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.017 182729 DEBUG nova.virt.hardware [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.021 182729 DEBUG nova.virt.libvirt.vif [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-721180198',display_name='tempest-ServerStableDeviceRescueTest-server-721180198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-721180198',id=108,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-76kjeq8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:40Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=9dc942b5-8b65-4eb7-a57c-30d0a6221426,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.021 182729 DEBUG nova.network.os_vif_util [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.022 182729 DEBUG nova.network.os_vif_util [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.022 182729 DEBUG nova.objects.instance [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.027 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.035 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <uuid>9dc942b5-8b65-4eb7-a57c-30d0a6221426</uuid>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <name>instance-0000006c</name>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-721180198</nova:name>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:34:44</nova:creationTime>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:user uuid="9d1e26d3056148e692e157703469d77a">tempest-ServerStableDeviceRescueTest-395714292-project-member</nova:user>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:project uuid="9b1f07a8546648baba916fffc53a0b93">tempest-ServerStableDeviceRescueTest-395714292</nova:project>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         <nova:port uuid="03c52ba6-920a-4801-bb3e-c99e3203ab13">
Jan 22 22:34:44 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <system>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="serial">9dc942b5-8b65-4eb7-a57c-30d0a6221426</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="uuid">9dc942b5-8b65-4eb7-a57c-30d0a6221426</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </system>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <os>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </os>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <features>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </features>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:6e:7f:79"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <target dev="tap03c52ba6-92"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/console.log" append="off"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <video>
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </video>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:34:44 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:34:44 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:34:44 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:34:44 compute-0 nova_compute[182725]: </domain>
Jan 22 22:34:44 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.036 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Preparing to wait for external event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.036 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.036 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.037 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.037 182729 DEBUG nova.virt.libvirt.vif [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-721180198',display_name='tempest-ServerStableDeviceRescueTest-server-721180198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-721180198',id=108,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-76kjeq8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:40Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=9dc942b5-8b65-4eb7-a57c-30d0a6221426,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.038 182729 DEBUG nova.network.os_vif_util [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.038 182729 DEBUG nova.network.os_vif_util [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.039 182729 DEBUG os_vif [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.039 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.040 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.043 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.043 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03c52ba6-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.044 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03c52ba6-92, col_values=(('external_ids', {'iface-id': '03c52ba6-920a-4801-bb3e-c99e3203ab13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:7f:79', 'vm-uuid': '9dc942b5-8b65-4eb7-a57c-30d0a6221426'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.045 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 NetworkManager[54954]: <info>  [1769121284.0470] manager: (tap03c52ba6-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.047 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.058 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.060 182729 INFO os_vif [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92')
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.110 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.111 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.111 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No VIF found with MAC fa:16:3e:6e:7f:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.111 182729 INFO nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Using config drive
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:b4:6a 10.100.0.14
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:b4:6a 10.100.0.14
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.666 182729 INFO nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Creating config drive at /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.672 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkajptocl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.808 182729 DEBUG oslo_concurrency.processutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkajptocl" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:44 compute-0 NetworkManager[54954]: <info>  [1769121284.8726] manager: (tap03c52ba6-92): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 22 22:34:44 compute-0 kernel: tap03c52ba6-92: entered promiscuous mode
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00375|binding|INFO|Claiming lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 for this chassis.
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00376|binding|INFO|03c52ba6-920a-4801-bb3e-c99e3203ab13: Claiming fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.874 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.884 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00377|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 ovn-installed in OVS
Jan 22 22:34:44 compute-0 ovn_controller[94850]: 2026-01-22T22:34:44Z|00378|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 up in Southbound
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.886 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 bound to our chassis
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.887 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.888 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 nova_compute[182725]: 2026-01-22 22:34:44.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:44 compute-0 systemd-udevd[225606]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.904 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ca39e906-3902-4d21-91b6-e50a10c4660e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.905 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.907 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.907 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[728b500e-9872-49ae-9e37-9d042d9f55be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.908 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[534d7b1f-1ab7-4f44-8d43-abc611c0096e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 systemd-machined[154006]: New machine qemu-44-instance-0000006c.
Jan 22 22:34:44 compute-0 NetworkManager[54954]: <info>  [1769121284.9182] device (tap03c52ba6-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:34:44 compute-0 NetworkManager[54954]: <info>  [1769121284.9190] device (tap03c52ba6-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.923 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5682f8a3-bd22-4af5-b638-2acad58ed210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-0000006c.
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.949 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1e32d9-b964-44f0-8baf-8ef567dee0fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.983 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[03183d65-42ec-44bf-bd2d-93c3dd0fa5a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:44 compute-0 NetworkManager[54954]: <info>  [1769121284.9930] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Jan 22 22:34:44 compute-0 systemd-udevd[225610]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:34:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:44.992 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9da667ec-49dd-422f-88fe-cc5a7d556f06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.029 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f98ffe65-4d57-4d08-85dc-52e04e2534b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.034 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1980d780-9e55-4409-9af5-081c4bc7f219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 NetworkManager[54954]: <info>  [1769121285.0590] device (tapad2345e3-00): carrier: link connected
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.064 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fba8cbb2-e80d-459a-a371-a0637db188f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.085 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bb687cc6-b23d-4ff9-8be0-ab5a15bbb697]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492266, 'reachable_time': 24384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225639, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.102 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3af7c48f-101e-4714-8fa0-805e8dec5229]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492266, 'tstamp': 492266}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225640, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.119 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e3c393-f72c-45ae-90f9-47212fa157a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492266, 'reachable_time': 24384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225641, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.150 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d66cc796-7e9f-42f6-b90d-ea1c550e6a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.165 182729 DEBUG nova.compute.manager [req-d3315fce-236e-4f97-9568-5adc6affa5de req-741e1501-4428-4c3c-81fb-158cf1034fe1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.166 182729 DEBUG oslo_concurrency.lockutils [req-d3315fce-236e-4f97-9568-5adc6affa5de req-741e1501-4428-4c3c-81fb-158cf1034fe1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.166 182729 DEBUG oslo_concurrency.lockutils [req-d3315fce-236e-4f97-9568-5adc6affa5de req-741e1501-4428-4c3c-81fb-158cf1034fe1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.166 182729 DEBUG oslo_concurrency.lockutils [req-d3315fce-236e-4f97-9568-5adc6affa5de req-741e1501-4428-4c3c-81fb-158cf1034fe1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.167 182729 DEBUG nova.compute.manager [req-d3315fce-236e-4f97-9568-5adc6affa5de req-741e1501-4428-4c3c-81fb-158cf1034fe1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Processing event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.229 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d74cac79-b91f-4b91-941e-557247ee6725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.232 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.232 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.233 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.235 182729 DEBUG nova.network.neutron [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updated VIF entry in instance network info cache for port 03c52ba6-920a-4801-bb3e-c99e3203ab13. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:34:45 compute-0 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 22:34:45 compute-0 NetworkManager[54954]: <info>  [1769121285.2362] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.236 182729 DEBUG nova.network.neutron [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updating instance_info_cache with network_info: [{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.237 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.241 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.242 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:45 compute-0 ovn_controller[94850]: 2026-01-22T22:34:45Z|00379|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.243 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.244 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cda6cc9d-9db7-4213-a96b-4a402cdc8cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.245 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:34:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:45.246 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.258 182729 DEBUG oslo_concurrency.lockutils [req-0d2cd04a-089c-4043-8fd9-977901217417 req-2de293f6-d5c4-4898-84e9-60bc1ebae352 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.498 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121285.4977245, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.498 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Started (Lifecycle Event)
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.500 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.518 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.519 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.523 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.525 182729 INFO nova.virt.libvirt.driver [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance spawned successfully.
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.526 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.541 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.542 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121285.4986992, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.542 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Paused (Lifecycle Event)
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.549 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.550 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.550 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.551 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.551 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.552 182729 DEBUG nova.virt.libvirt.driver [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.557 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.560 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121285.5046575, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.560 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Resumed (Lifecycle Event)
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.580 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.585 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.609 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.618 182729 INFO nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Took 5.33 seconds to spawn the instance on the hypervisor.
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.620 182729 DEBUG nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:45 compute-0 podman[225680]: 2026-01-22 22:34:45.631503463 +0000 UTC m=+0.056362430 container create afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:34:45 compute-0 systemd[1]: Started libpod-conmon-afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5.scope.
Jan 22 22:34:45 compute-0 podman[225680]: 2026-01-22 22:34:45.599474777 +0000 UTC m=+0.024333764 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.698 182729 INFO nova.compute.manager [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Took 5.91 seconds to build instance.
Jan 22 22:34:45 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:34:45 compute-0 nova_compute[182725]: 2026-01-22 22:34:45.715 182729 DEBUG oslo_concurrency.lockutils [None req-d2fca8fa-3021-4d61-8c72-4a125acf81a0 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31b14f982c8a28a75cd26ee29c2205a9443fa5c79ef02e11c8a1db57f4949caf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:34:45 compute-0 podman[225680]: 2026-01-22 22:34:45.734295631 +0000 UTC m=+0.159154608 container init afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:34:45 compute-0 podman[225680]: 2026-01-22 22:34:45.745065323 +0000 UTC m=+0.169924280 container start afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:34:45 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [NOTICE]   (225699) : New worker (225701) forked
Jan 22 22:34:45 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [NOTICE]   (225699) : Loading success.
Jan 22 22:34:46 compute-0 nova_compute[182725]: 2026-01-22 22:34:46.199 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.242 182729 DEBUG nova.compute.manager [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.243 182729 DEBUG oslo_concurrency.lockutils [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.244 182729 DEBUG oslo_concurrency.lockutils [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.244 182729 DEBUG oslo_concurrency.lockutils [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.244 182729 DEBUG nova.compute.manager [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.244 182729 WARNING nova.compute.manager [req-f814c8ff-e98f-4fdf-b9b5-3c9373d2e556 req-c82e63f3-93c2-47c8-9a75-c7213e90dd34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state None.
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.667 182729 DEBUG nova.compute.manager [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:34:47 compute-0 nova_compute[182725]: 2026-01-22 22:34:47.754 182729 INFO nova.compute.manager [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] instance snapshotting
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.026 182729 INFO nova.virt.libvirt.driver [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Beginning live snapshot process
Jan 22 22:34:48 compute-0 virtqemud[182297]: invalid argument: disk vda does not have an active block job
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.243 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.309 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.311 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.378 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.398 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.462 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.464 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.508 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.510 182729 INFO nova.virt.libvirt.driver [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.576 182729 DEBUG nova.virt.libvirt.guest [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.583 182729 INFO nova.virt.libvirt.driver [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 22:34:48 compute-0 podman[225722]: 2026-01-22 22:34:48.615346077 +0000 UTC m=+0.071487911 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.642 182729 DEBUG nova.privsep.utils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.643 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1.delta /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.863 182729 DEBUG oslo_concurrency.processutils [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1.delta /var/lib/nova/instances/snapshots/tmp2mwsobfk/8fe8bbb80a9a4985a04e7b432c3df4a1" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:48 compute-0 nova_compute[182725]: 2026-01-22 22:34:48.864 182729 INFO nova.virt.libvirt.driver [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Snapshot extracted, beginning image upload
Jan 22 22:34:49 compute-0 nova_compute[182725]: 2026-01-22 22:34:49.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:49 compute-0 nova_compute[182725]: 2026-01-22 22:34:49.045 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:49 compute-0 nova_compute[182725]: 2026-01-22 22:34:49.678 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:50 compute-0 nova_compute[182725]: 2026-01-22 22:34:50.778 182729 INFO nova.virt.libvirt.driver [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Snapshot image upload complete
Jan 22 22:34:50 compute-0 nova_compute[182725]: 2026-01-22 22:34:50.779 182729 INFO nova.compute.manager [None req-77b2863a-f0c2-42d3-8b1b-7c509a147037 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Took 3.01 seconds to snapshot the instance on the hypervisor.
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.185 182729 INFO nova.compute.manager [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Rescuing
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.187 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.187 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.188 182729 DEBUG nova.network.neutron [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.916 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:34:53 compute-0 nova_compute[182725]: 2026-01-22 22:34:53.946 182729 DEBUG nova.compute.manager [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.046 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.048 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.048 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.111 182729 DEBUG nova.objects.instance [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_requests' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.135 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.135 182729 INFO nova.compute.claims [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.136 182729 DEBUG nova.objects.instance [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.148 182729 DEBUG nova.objects.instance [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:34:54 compute-0 podman[225759]: 2026-01-22 22:34:54.162072077 +0000 UTC m=+0.089392282 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7)
Jan 22 22:34:54 compute-0 podman[225758]: 2026-01-22 22:34:54.16776271 +0000 UTC m=+0.096906361 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.192 182729 INFO nova.compute.resource_tracker [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating resource usage from migration adef7b22-ad57-4851-ab9d-b816513343b5
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.279 182729 DEBUG nova.compute.provider_tree [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.299 182729 DEBUG nova.scheduler.client.report [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.324 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.324 182729 INFO nova.compute.manager [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Migrating
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.367 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.368 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:34:54 compute-0 nova_compute[182725]: 2026-01-22 22:34:54.368 182729 DEBUG nova.network.neutron [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.083 182729 DEBUG nova.network.neutron [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updating instance_info_cache with network_info: [{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.115 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.317 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.648 182729 DEBUG nova.network.neutron [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.663 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.789 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 22:34:55 compute-0 nova_compute[182725]: 2026-01-22 22:34:55.795 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:34:57 compute-0 ovn_controller[94850]: 2026-01-22T22:34:57Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:34:57 compute-0 ovn_controller[94850]: 2026-01-22T22:34:57Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:34:57 compute-0 kernel: tapafca4e40-94 (unregistering): left promiscuous mode
Jan 22 22:34:58 compute-0 NetworkManager[54954]: <info>  [1769121298.0027] device (tapafca4e40-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:34:58 compute-0 ovn_controller[94850]: 2026-01-22T22:34:58Z|00380|binding|INFO|Releasing lport afca4e40-9413-4bf5-91f1-2208d0fb0153 from this chassis (sb_readonly=0)
Jan 22 22:34:58 compute-0 ovn_controller[94850]: 2026-01-22T22:34:58Z|00381|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 down in Southbound
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.011 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 ovn_controller[94850]: 2026-01-22T22:34:58Z|00382|binding|INFO|Removing iface tapafca4e40-94 ovn-installed in OVS
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.023 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.025 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.025 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.027 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.028 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c1374096-5737-4f67-a849-7a51e1710816]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.028 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:34:58 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 22 22:34:58 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000069.scope: Consumed 13.678s CPU time.
Jan 22 22:34:58 compute-0 systemd-machined[154006]: Machine qemu-43-instance-00000069 terminated.
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [NOTICE]   (225468) : haproxy version is 2.8.14-c23fe91
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [NOTICE]   (225468) : path to executable is /usr/sbin/haproxy
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [WARNING]  (225468) : Exiting Master process...
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [WARNING]  (225468) : Exiting Master process...
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [ALERT]    (225468) : Current worker (225470) exited with code 143 (Terminated)
Jan 22 22:34:58 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[225463]: [WARNING]  (225468) : All workers exited. Exiting... (0)
Jan 22 22:34:58 compute-0 systemd[1]: libpod-1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf.scope: Deactivated successfully.
Jan 22 22:34:58 compute-0 podman[225846]: 2026-01-22 22:34:58.163015512 +0000 UTC m=+0.045669111 container died 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:34:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf-userdata-shm.mount: Deactivated successfully.
Jan 22 22:34:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-0df098f2126e9d0dba755a39d75fd5879c5b55065ef7ccb005c16447ca724e2c-merged.mount: Deactivated successfully.
Jan 22 22:34:58 compute-0 podman[225846]: 2026-01-22 22:34:58.200114606 +0000 UTC m=+0.082768185 container cleanup 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 22:34:58 compute-0 systemd[1]: libpod-conmon-1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf.scope: Deactivated successfully.
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.232 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.237 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 podman[225876]: 2026-01-22 22:34:58.27175496 +0000 UTC m=+0.049716053 container remove 1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.278 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[689b76ec-6699-455d-aa80-b25d23fd4beb]: (4, ('Thu Jan 22 10:34:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf)\n1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf\nThu Jan 22 10:34:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf)\n1123c91a4fb02d54b4f94cb4e1b2c0e660d01066cf3db3fe1f09f3f0259eefbf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.280 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4e110d59-1697-4365-a08a-608817447a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.281 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.282 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.299 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.300 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.301 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[30fcc781-2d69-45e6-8ac5-4f4d8de2e24a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.318 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[89aa408e-6a3a-4938-9d0c-3f64b8317791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.320 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e791102f-32b5-4e1b-8472-fb005cae63d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.337 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[09d0ecf5-8d64-468c-b37b-886db48a7434]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490873, 'reachable_time': 44570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225910, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.340 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:34:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:34:58.340 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[ca138c49-6364-41a1-9bd3-b7831357fe8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:34:58 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.816 182729 INFO nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance shutdown successfully after 3 seconds.
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.822 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance destroyed successfully.
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.823 182729 DEBUG nova.virt.libvirt.vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:34:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.823 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.824 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.824 182729 DEBUG os_vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.826 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.826 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafca4e40-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.828 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.833 182729 INFO os_vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.838 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.918 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:58 compute-0 nova_compute[182725]: 2026-01-22 22:34:58.920 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.021 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.024 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.052 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.067 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.069 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk.config /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.106 182729 DEBUG nova.compute.manager [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.107 182729 DEBUG oslo_concurrency.lockutils [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.107 182729 DEBUG oslo_concurrency.lockutils [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.107 182729 DEBUG oslo_concurrency.lockutils [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.108 182729 DEBUG nova.compute.manager [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.108 182729 WARNING nova.compute.manager [req-e5368d18-dd98-49b2-b08e-03ba67a7167b req-3b356974-e3b6-44c3-b12c-2a3834ca30ef 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state active and task_state resize_migrating.
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.109 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk.config /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.110 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk.info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.143 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "cp -r /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_resize/disk.info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.703 182729 DEBUG nova.network.neutron [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.879 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.880 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:34:59 compute-0 nova_compute[182725]: 2026-01-22 22:34:59.880 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:00 compute-0 nova_compute[182725]: 2026-01-22 22:35:00.656 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:00 compute-0 nova_compute[182725]: 2026-01-22 22:35:00.656 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:00 compute-0 nova_compute[182725]: 2026-01-22 22:35:00.657 182729 DEBUG nova.network.neutron [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.006 182729 DEBUG nova.compute.manager [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.007 182729 DEBUG oslo_concurrency.lockutils [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.008 182729 DEBUG oslo_concurrency.lockutils [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.008 182729 DEBUG oslo_concurrency.lockutils [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.009 182729 DEBUG nova.compute.manager [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:02 compute-0 nova_compute[182725]: 2026-01-22 22:35:02.009 182729 WARNING nova.compute.manager [req-5390b87d-cc25-441d-afbd-d456d9d48751 req-c7d0b2c8-7036-4be9-8f59-6adb7b97502f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state active and task_state resize_migrated.
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.688 182729 DEBUG nova.network.neutron [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.745 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.904 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.907 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.908 182729 INFO nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Creating image(s)
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.909 182729 DEBUG nova.objects.instance [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'trusted_certs' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:03 compute-0 nova_compute[182725]: 2026-01-22 22:35:03.935 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.000 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.002 182729 DEBUG nova.virt.disk.api [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.003 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.041 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.094 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.095 182729 DEBUG nova.virt.disk.api [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.119 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.120 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Ensure instance console log exists: /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.121 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.122 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.122 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.127 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start _get_guest_xml network_info=[{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:35:04 compute-0 podman[225925]: 2026-01-22 22:35:04.13619943 +0000 UTC m=+0.062870544 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.138 182729 WARNING nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:04 compute-0 podman[225926]: 2026-01-22 22:35:04.166302308 +0000 UTC m=+0.097365052 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.174 182729 DEBUG nova.virt.libvirt.host [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.176 182729 DEBUG nova.virt.libvirt.host [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.180 182729 DEBUG nova.virt.libvirt.host [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.181 182729 DEBUG nova.virt.libvirt.host [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.183 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.183 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.184 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.184 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.184 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.185 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.185 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.185 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.185 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.186 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.186 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.186 182729 DEBUG nova.virt.hardware [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.186 182729 DEBUG nova.objects.instance [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.210 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.302 182729 DEBUG oslo_concurrency.processutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.304 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.304 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.305 182729 DEBUG oslo_concurrency.lockutils [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.307 182729 DEBUG nova.virt.libvirt.vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.307 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.308 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.311 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <uuid>def087b6-47f2-4914-8d90-1e4426e4da0a</uuid>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <name>instance-00000069</name>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <memory>196608</memory>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-386387448</nova:name>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:35:04</nova:creationTime>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:flavor name="m1.micro">
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:memory>192</nova:memory>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         <nova:port uuid="afca4e40-9413-4bf5-91f1-2208d0fb0153">
Jan 22 22:35:04 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <system>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="serial">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="uuid">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </system>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <os>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </os>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <features>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </features>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:fa:b4:6a"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <target dev="tapafca4e40-94"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/console.log" append="off"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <video>
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </video>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:35:04 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:35:04 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:35:04 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:35:04 compute-0 nova_compute[182725]: </domain>
Jan 22 22:35:04 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.312 182729 DEBUG nova.virt.libvirt.vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.312 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-546014484-network", "vif_mac": "fa:16:3e:fa:b4:6a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.313 182729 DEBUG nova.network.os_vif_util [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.313 182729 DEBUG os_vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.314 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.315 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.315 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.318 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.319 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafca4e40-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.319 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafca4e40-94, col_values=(('external_ids', {'iface-id': 'afca4e40-9413-4bf5-91f1-2208d0fb0153', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:b4:6a', 'vm-uuid': 'def087b6-47f2-4914-8d90-1e4426e4da0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.3245] manager: (tapafca4e40-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.324 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.330 182729 INFO os_vif [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.409 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.410 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.411 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No VIF found with MAC fa:16:3e:fa:b4:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.412 182729 INFO nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Using config drive
Jan 22 22:35:04 compute-0 kernel: tapafca4e40-94: entered promiscuous mode
Jan 22 22:35:04 compute-0 ovn_controller[94850]: 2026-01-22T22:35:04Z|00383|binding|INFO|Claiming lport afca4e40-9413-4bf5-91f1-2208d0fb0153 for this chassis.
Jan 22 22:35:04 compute-0 ovn_controller[94850]: 2026-01-22T22:35:04Z|00384|binding|INFO|afca4e40-9413-4bf5-91f1-2208d0fb0153: Claiming fa:16:3e:fa:b4:6a 10.100.0.14
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.4997] manager: (tapafca4e40-94): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.500 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.518 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.520 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:35:04 compute-0 ovn_controller[94850]: 2026-01-22T22:35:04Z|00385|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 ovn-installed in OVS
Jan 22 22:35:04 compute-0 ovn_controller[94850]: 2026-01-22T22:35:04Z|00386|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 up in Southbound
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.522 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.524 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.526 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.544 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5db3e139-df7b-4ab0-baa3-767748a31997]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.545 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:04 compute-0 systemd-udevd[225986]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.548 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.549 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac50e1e0-d9a5-4da2-a557-ea8cbab8857c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.550 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[225ab023-1ee9-42da-b838-71125225cf07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.5634] device (tapafca4e40-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.5643] device (tapafca4e40-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.568 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[353e5f7f-7295-446f-9d9d-0351b044d2c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 systemd-machined[154006]: New machine qemu-45-instance-00000069.
Jan 22 22:35:04 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000069.
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.593 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4fcfb7d6-57f6-42e0-a48c-bc3bda79b90b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.625 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.626 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.635 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[571cc300-fdb5-437e-bdb0-ccfe002a7bb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.643 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[390c5dda-c2a7-4485-a717-e75caef36304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.6445] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.704 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[90d6a0d1-6b95-4c7e-87ad-71e7de2b966c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.713 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d6631d-eb77-4aa0-86da-e20d1b71eb4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.7404] device (tape65877e5-00): carrier: link connected
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.746 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bdfd83-fd6d-4b8f-b110-ccca99dddf22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.770 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[961aae46-a2f3-4c95-a724-189d1f7449b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494234, 'reachable_time': 42964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226020, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.790 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d4f9f9-7dda-4335-921a-24272a7522f6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494234, 'tstamp': 494234}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226021, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.811 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e62a1f8b-e1ad-4cd2-a630-7f32f2eb9321]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494234, 'reachable_time': 42964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226022, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.853 182729 DEBUG nova.compute.manager [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.853 182729 DEBUG oslo_concurrency.lockutils [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.854 182729 DEBUG oslo_concurrency.lockutils [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.854 182729 DEBUG oslo_concurrency.lockutils [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.854 182729 DEBUG nova.compute.manager [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.854 182729 WARNING nova.compute.manager [req-f2fa1739-8da2-4f93-9350-afbf6fcdef15 req-1c07cce3-1c63-4898-bd00-c5dd6c8fac1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state active and task_state resize_finish.
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.855 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa7eead-97f0-4a16-9c24-250a3f75bc9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.943 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[41ac5c6e-bf75-4526-806f-349a51a0e0f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.945 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.945 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.945 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 NetworkManager[54954]: <info>  [1769121304.9497] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 22 22:35:04 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.953 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:04 compute-0 ovn_controller[94850]: 2026-01-22T22:35:04Z|00387|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.983 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 nova_compute[182725]: 2026-01-22 22:35:04.984 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.987 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.989 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[05822a4d-b10f-4cf3-b1fe-c8c37d132bbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.990 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:04.991 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.379 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:35:05 compute-0 podman[226054]: 2026-01-22 22:35:05.459024519 +0000 UTC m=+0.067600313 container create e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:35:05 compute-0 systemd[1]: Started libpod-conmon-e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857.scope.
Jan 22 22:35:05 compute-0 podman[226054]: 2026-01-22 22:35:05.431082155 +0000 UTC m=+0.039657989 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:05 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aca0daf54bfa36cb0dd35b5b767598b04a5f732e1b58a032f63c28ceb5d62c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:05 compute-0 podman[226054]: 2026-01-22 22:35:05.576304442 +0000 UTC m=+0.184880276 container init e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:35:05 compute-0 podman[226054]: 2026-01-22 22:35:05.589594157 +0000 UTC m=+0.198169991 container start e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:35:05 compute-0 podman[226067]: 2026-01-22 22:35:05.597151167 +0000 UTC m=+0.101948438 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:35:05 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [NOTICE]   (226098) : New worker (226100) forked
Jan 22 22:35:05 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [NOTICE]   (226098) : Loading success.
Jan 22 22:35:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:05.689 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.858 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for def087b6-47f2-4914-8d90-1e4426e4da0a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.858 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121305.8573735, def087b6-47f2-4914-8d90-1e4426e4da0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.858 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Resumed (Lifecycle Event)
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.861 182729 DEBUG nova.compute.manager [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.865 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance running successfully.
Jan 22 22:35:05 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.868 182729 DEBUG nova.virt.libvirt.guest [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.868 182729 DEBUG nova.virt.libvirt.driver [None req-9c015695-163a-4923-8f0b-5460d20953ab 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.880 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.888 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.937 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.938 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121305.8603132, def087b6-47f2-4914-8d90-1e4426e4da0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.938 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Started (Lifecycle Event)
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.960 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:05 compute-0 nova_compute[182725]: 2026-01-22 22:35:05.969 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:06.691 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.042 182729 DEBUG nova.compute.manager [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.043 182729 DEBUG oslo_concurrency.lockutils [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.043 182729 DEBUG oslo_concurrency.lockutils [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.043 182729 DEBUG oslo_concurrency.lockutils [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.043 182729 DEBUG nova.compute.manager [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.043 182729 WARNING nova.compute.manager [req-ff4a96d7-3fab-4d47-83f3-7417cbf0a56a req-9f88a815-5d8b-46fc-8cae-5b69fb4e1db2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state resized and task_state None.
Jan 22 22:35:07 compute-0 kernel: tap03c52ba6-92 (unregistering): left promiscuous mode
Jan 22 22:35:07 compute-0 NetworkManager[54954]: <info>  [1769121307.6242] device (tap03c52ba6-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:35:07 compute-0 ovn_controller[94850]: 2026-01-22T22:35:07Z|00388|binding|INFO|Releasing lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 from this chassis (sb_readonly=0)
Jan 22 22:35:07 compute-0 ovn_controller[94850]: 2026-01-22T22:35:07Z|00389|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 down in Southbound
Jan 22 22:35:07 compute-0 ovn_controller[94850]: 2026-01-22T22:35:07Z|00390|binding|INFO|Removing iface tap03c52ba6-92 ovn-installed in OVS
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.656 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.659 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.670 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.671 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.672 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.680 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4d1e0d-9630-4901-b819-bcbc6c9667a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.681 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.692 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 22 22:35:07 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000006c.scope: Consumed 13.200s CPU time.
Jan 22 22:35:07 compute-0 systemd-machined[154006]: Machine qemu-44-instance-0000006c terminated.
Jan 22 22:35:07 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [NOTICE]   (225699) : haproxy version is 2.8.14-c23fe91
Jan 22 22:35:07 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [NOTICE]   (225699) : path to executable is /usr/sbin/haproxy
Jan 22 22:35:07 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [WARNING]  (225699) : Exiting Master process...
Jan 22 22:35:07 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [ALERT]    (225699) : Current worker (225701) exited with code 143 (Terminated)
Jan 22 22:35:07 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[225695]: [WARNING]  (225699) : All workers exited. Exiting... (0)
Jan 22 22:35:07 compute-0 systemd[1]: libpod-afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5.scope: Deactivated successfully.
Jan 22 22:35:07 compute-0 podman[226142]: 2026-01-22 22:35:07.848873467 +0000 UTC m=+0.051230561 container died afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.856 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.861 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5-userdata-shm.mount: Deactivated successfully.
Jan 22 22:35:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-31b14f982c8a28a75cd26ee29c2205a9443fa5c79ef02e11c8a1db57f4949caf-merged.mount: Deactivated successfully.
Jan 22 22:35:07 compute-0 podman[226142]: 2026-01-22 22:35:07.896764903 +0000 UTC m=+0.099121977 container cleanup afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:07 compute-0 systemd[1]: libpod-conmon-afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5.scope: Deactivated successfully.
Jan 22 22:35:07 compute-0 podman[226183]: 2026-01-22 22:35:07.961002561 +0000 UTC m=+0.039772373 container remove afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.970 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d744fa-b438-4ff2-888b-57303bc3f26f]: (4, ('Thu Jan 22 10:35:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5)\nafe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5\nThu Jan 22 10:35:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (afe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5)\nafe870973be46b153d0535c4bbe0e46c7303dd003d8ef7a69c8ef92d32784ed5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.972 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9dde5d-76fd-4d9f-8750-a05ecd5af5b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.972 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:07 compute-0 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.992 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 nova_compute[182725]: 2026-01-22 22:35:07.995 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:07.999 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b92da258-c411-4d0a-a896-b07eaf8adcd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:08.023 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[504dbd9f-54a0-4693-ab3e-0367b36ab734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:08.024 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0afed62c-540b-49b5-bbb3-9c413cc48b47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:08.042 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[76ef1e4e-735b-4efd-b111-695348165325]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492257, 'reachable_time': 26878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226203, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:08 compute-0 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 22:35:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:08.048 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:35:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:08.048 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[44be717f-24cf-49bb-af7a-e69705a6f4e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.400 182729 INFO nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance shutdown successfully after 13 seconds.
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.408 182729 INFO nova.virt.libvirt.driver [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance destroyed successfully.
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.409 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.440 182729 INFO nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Attempting a stable device rescue
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.758 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.766 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.767 182729 INFO nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Creating image(s)
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.768 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.769 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.770 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.771 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.797 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.799 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.830 182729 DEBUG nova.network.neutron [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.831 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.831 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:08 compute-0 nova_compute[182725]: 2026-01-22 22:35:08.832 182729 DEBUG nova.network.neutron [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.042 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.247 182729 DEBUG nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.248 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.249 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.249 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.250 182729 DEBUG nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.250 182729 WARNING nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state rescuing.
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.251 182729 DEBUG nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.252 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.252 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.253 182729 DEBUG oslo_concurrency.lockutils [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.253 182729 DEBUG nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.253 182729 WARNING nova.compute.manager [req-773c3ad6-4b53-4c87-84f9-5ab84754da01 req-ab481d61-b10b-4825-9bfe-6b6f1c76eb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state rescuing.
Jan 22 22:35:09 compute-0 nova_compute[182725]: 2026-01-22 22:35:09.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.264 182729 DEBUG nova.network.neutron [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.282 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.298 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Creating tmpfile /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/tmpp4hk25v6 to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 22 22:35:10 compute-0 kernel: tapafca4e40-94 (unregistering): left promiscuous mode
Jan 22 22:35:10 compute-0 NetworkManager[54954]: <info>  [1769121310.3336] device (tapafca4e40-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.340 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 ovn_controller[94850]: 2026-01-22T22:35:10Z|00391|binding|INFO|Releasing lport afca4e40-9413-4bf5-91f1-2208d0fb0153 from this chassis (sb_readonly=0)
Jan 22 22:35:10 compute-0 ovn_controller[94850]: 2026-01-22T22:35:10Z|00392|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 down in Southbound
Jan 22 22:35:10 compute-0 ovn_controller[94850]: 2026-01-22T22:35:10Z|00393|binding|INFO|Removing iface tapafca4e40-94 ovn-installed in OVS
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.361 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.362 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.364 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.365 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f773c9a4-aa75-4cc6-b673-fc38539d4b8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.365 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.369 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.377 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 22 22:35:10 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Consumed 5.847s CPU time.
Jan 22 22:35:10 compute-0 systemd-machined[154006]: Machine qemu-45-instance-00000069 terminated.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.410 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.part --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.413 182729 DEBUG nova.virt.images [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] 6962d84c-ff3a-4fbc-bfe5-4022aacf7fa1 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.415 182729 DEBUG nova.privsep.utils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.416 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.part /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 NetworkManager[54954]: <info>  [1769121310.5341] manager: (tapafca4e40-94): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.586 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance destroyed successfully.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.589 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.part /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.converted" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.590 182729 DEBUG nova.objects.instance [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [NOTICE]   (226098) : haproxy version is 2.8.14-c23fe91
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [NOTICE]   (226098) : path to executable is /usr/sbin/haproxy
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [WARNING]  (226098) : Exiting Master process...
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [WARNING]  (226098) : Exiting Master process...
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [ALERT]    (226098) : Current worker (226100) exited with code 143 (Terminated)
Jan 22 22:35:10 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226083]: [WARNING]  (226098) : All workers exited. Exiting... (0)
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.598 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 systemd[1]: libpod-e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857.scope: Deactivated successfully.
Jan 22 22:35:10 compute-0 podman[226234]: 2026-01-22 22:35:10.607692795 +0000 UTC m=+0.111035497 container died e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857-userdata-shm.mount: Deactivated successfully.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.640 182729 DEBUG nova.virt.libvirt.vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.641 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.642 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-7aca0daf54bfa36cb0dd35b5b767598b04a5f732e1b58a032f63c28ceb5d62c5-merged.mount: Deactivated successfully.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.643 182729 DEBUG os_vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:35:10 compute-0 podman[226234]: 2026-01-22 22:35:10.646220925 +0000 UTC m=+0.149563577 container cleanup e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.646 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.646 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafca4e40-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.648 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.650 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.655 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.660 182729 INFO os_vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.665 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.666 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:10 compute-0 systemd[1]: libpod-conmon-e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857.scope: Deactivated successfully.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.671 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.672 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.686 182729 DEBUG nova.objects.instance [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.688 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.688 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.698 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 podman[226280]: 2026-01-22 22:35:10.73259438 +0000 UTC m=+0.056425132 container remove e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.738 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f522163-cc5c-41aa-86e9-709577b03263]: (4, ('Thu Jan 22 10:35:10 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857)\ne28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857\nThu Jan 22 10:35:10 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (e28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857)\ne28a67e84992b8f72cc76e79d6488c3337ebddc284d9c93ba0e402ac5ff2c857\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.740 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8d3f77-59ce-4477-9582-a232542745b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.741 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.758 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.762 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2fa98d-aa9a-4089-b80d-3564ea82b07e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.772 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.773 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc,backing_fmt=raw /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.776 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20e5eaf8-fe0d-4a6e-b0d3-b919e5867752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.778 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f67192d9-5ff6-407c-9137-b85c60b64a93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.803 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8df6cd49-fa03-406f-a337-26dcacb7a5cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494222, 'reachable_time': 21452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226305, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.807 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:35:10 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:35:10 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:10.808 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5c35e49b-ac63-4709-8ced-4d97f0427060]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.817 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc,backing_fmt=raw /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.rescue" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.818 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "2de6e157e1fcab398fb030cb85fc31760486f7cc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.818 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'migration_context' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.831 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.833 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start _get_guest_xml network_info=[{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:6e:7f:79"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '6962d84c-ff3a-4fbc-bfe5-4022aacf7fa1', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.834 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'resources' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.850 182729 DEBUG nova.compute.provider_tree [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.861 182729 WARNING nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.868 182729 DEBUG nova.scheduler.client.report [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.875 182729 DEBUG nova.virt.libvirt.host [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.876 182729 DEBUG nova.virt.libvirt.host [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.881 182729 DEBUG nova.virt.libvirt.host [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.882 182729 DEBUG nova.virt.libvirt.host [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.884 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.884 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.885 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.886 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.886 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.886 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.887 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.887 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.888 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.888 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.889 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.889 182729 DEBUG nova.virt.hardware [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.890 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.908 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.943 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.983 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.984 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.984 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.985 182729 DEBUG oslo_concurrency.lockutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.986 182729 DEBUG nova.virt.libvirt.vif [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-721180198',display_name='tempest-ServerStableDeviceRescueTest-server-721180198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-721180198',id=108,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-76kjeq8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:34:50Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=9dc942b5-8b65-4eb7-a57c-30d0a6221426,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:6e:7f:79"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.987 182729 DEBUG nova.network.os_vif_util [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:6e:7f:79"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.988 182729 DEBUG nova.network.os_vif_util [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:10 compute-0 nova_compute[182725]: 2026-01-22 22:35:10.989 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.004 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <uuid>9dc942b5-8b65-4eb7-a57c-30d0a6221426</uuid>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <name>instance-0000006c</name>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-721180198</nova:name>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:35:10</nova:creationTime>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:user uuid="9d1e26d3056148e692e157703469d77a">tempest-ServerStableDeviceRescueTest-395714292-project-member</nova:user>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:project uuid="9b1f07a8546648baba916fffc53a0b93">tempest-ServerStableDeviceRescueTest-395714292</nova:project>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         <nova:port uuid="03c52ba6-920a-4801-bb3e-c99e3203ab13">
Jan 22 22:35:11 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <system>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="serial">9dc942b5-8b65-4eb7-a57c-30d0a6221426</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="uuid">9dc942b5-8b65-4eb7-a57c-30d0a6221426</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </system>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <os>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </os>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <features>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </features>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.rescue"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <target dev="sdb" bus="usb"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <boot order="1"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:6e:7f:79"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <target dev="tap03c52ba6-92"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/console.log" append="off"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <video>
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </video>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:35:11 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:35:11 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:35:11 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:35:11 compute-0 nova_compute[182725]: </domain>
Jan 22 22:35:11 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.025 182729 INFO nova.virt.libvirt.driver [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance destroyed successfully.
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.108 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.109 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.109 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.110 182729 DEBUG nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No VIF found with MAC fa:16:3e:6e:7f:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.110 182729 INFO nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Using config drive
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.114 182729 INFO nova.compute.manager [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Swapping old allocation on dict_keys(['4f7db789-7f4b-4901-9c88-ecf66d0aff43']) held by migration adef7b22-ad57-4851-ab9d-b816513343b5 for instance
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.127 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.147 182729 DEBUG nova.scheduler.client.report [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Overwriting current allocation {'allocations': {'4f7db789-7f4b-4901-9c88-ecf66d0aff43': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 63}}, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'user_id': '97ae504d8c4f43529c360266766791d0', 'consumer_generation': 1} on consumer def087b6-47f2-4914-8d90-1e4426e4da0a move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.152 182729 DEBUG nova.objects.instance [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'keypairs' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.361 182729 DEBUG nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.362 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.363 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.364 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.364 182729 DEBUG nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.365 182729 WARNING nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-unplugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state resized and task_state resize_reverting.
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.365 182729 DEBUG nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.366 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.366 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.366 182729 DEBUG oslo_concurrency.lockutils [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.367 182729 DEBUG nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.367 182729 WARNING nova.compute.manager [req-f9416177-ac4e-40b7-b9ea-027270098883 req-70c78593-211a-42aa-8367-366500b9142d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state resized and task_state resize_reverting.
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.376 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.376 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.377 182729 DEBUG nova.network.neutron [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.507 182729 INFO nova.virt.libvirt.driver [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Creating config drive at /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config.rescue
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.517 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx93k9tz3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.656 182729 DEBUG oslo_concurrency.processutils [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx93k9tz3" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:11 compute-0 kernel: tap03c52ba6-92: entered promiscuous mode
Jan 22 22:35:11 compute-0 systemd-udevd[226306]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.758 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:11 compute-0 ovn_controller[94850]: 2026-01-22T22:35:11Z|00394|binding|INFO|Claiming lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 for this chassis.
Jan 22 22:35:11 compute-0 ovn_controller[94850]: 2026-01-22T22:35:11Z|00395|binding|INFO|03c52ba6-920a-4801-bb3e-c99e3203ab13: Claiming fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:35:11 compute-0 NetworkManager[54954]: <info>  [1769121311.7610] manager: (tap03c52ba6-92): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 22 22:35:11 compute-0 NetworkManager[54954]: <info>  [1769121311.7720] device (tap03c52ba6-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:11 compute-0 NetworkManager[54954]: <info>  [1769121311.7742] device (tap03c52ba6-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:11 compute-0 ovn_controller[94850]: 2026-01-22T22:35:11Z|00396|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 ovn-installed in OVS
Jan 22 22:35:11 compute-0 ovn_controller[94850]: 2026-01-22T22:35:11Z|00397|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 up in Southbound
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.785 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '5', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.786 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 bound to our chassis
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.787 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:11 compute-0 nova_compute[182725]: 2026-01-22 22:35:11.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.798 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1b967c9b-bd6c-4050-88d4-afa899ccb634]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.798 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.800 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.801 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea4a72e-3a5d-462e-9faf-9b97490403b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.801 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fa07ee9e-6ac7-426a-b1de-02a063012c83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.818 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[21fbfaad-7df1-4341-8055-f386e4154570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 systemd-machined[154006]: New machine qemu-46-instance-0000006c.
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.836 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2f411d-23c1-40dd-a804-3be880910aa4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-0000006c.
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.873 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6af2dc-abf5-4c5e-a847-4f1280705918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.879 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82c0a426-e723-4d05-9d98-3325c6a80e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 NetworkManager[54954]: <info>  [1769121311.8809] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.922 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2589f8-3d97-4209-9770-a835c336700a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.926 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c89c0022-fbf1-4448-8a47-02b54dd30914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 NetworkManager[54954]: <info>  [1769121311.9568] device (tapad2345e3-00): carrier: link connected
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.965 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b2f8cc-9384-4d95-8792-8d3625fc0527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:11 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:11.986 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[29af36dd-a2db-4351-ba38-29849e7775a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494955, 'reachable_time': 18576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226366, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.013 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b967581a-387c-4226-b45e-ddd75b97c892]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494955, 'tstamp': 494955}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226367, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.036 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7f85a158-3f9b-45ef-a804-2bb51e500160]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494955, 'reachable_time': 18576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226368, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.073 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ccede8c8-2dbf-41ac-ad14-a34c2069e8ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.148 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 9dc942b5-8b65-4eb7-a57c-30d0a6221426 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.148 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121312.1474838, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.149 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Resumed (Lifecycle Event)
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.154 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6651ed-a829-480a-beff-ba8bb3490f38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.158 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.158 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.158 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:12 compute-0 NetworkManager[54954]: <info>  [1769121312.1613] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 22 22:35:12 compute-0 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.166 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:12 compute-0 ovn_controller[94850]: 2026-01-22T22:35:12Z|00398|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.170 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.171 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f0191322-cd16-4915-96f8-fd1d103fab4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.173 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.174 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.174 182729 DEBUG nova.compute.manager [None req-7de92590-3b44-4f74-b577-45a3ede0a9b6 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.179 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.179 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.188 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.223 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.223 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121312.1492615, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.224 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Started (Lifecycle Event)
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.277 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:12 compute-0 nova_compute[182725]: 2026-01-22 22:35:12.282 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.443 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.444 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:12.445 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:12 compute-0 podman[226407]: 2026-01-22 22:35:12.585912407 +0000 UTC m=+0.052465882 container create b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:12 compute-0 systemd[1]: Started libpod-conmon-b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907.scope.
Jan 22 22:35:12 compute-0 podman[226407]: 2026-01-22 22:35:12.556469256 +0000 UTC m=+0.023022751 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:12 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a45a551051657db443545bd49d039040e37dbfa0d211d91ffb465be6af1125d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:12 compute-0 podman[226407]: 2026-01-22 22:35:12.675880853 +0000 UTC m=+0.142434348 container init b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:35:12 compute-0 podman[226407]: 2026-01-22 22:35:12.683454653 +0000 UTC m=+0.150008168 container start b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 22:35:12 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [NOTICE]   (226427) : New worker (226429) forked
Jan 22 22:35:12 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [NOTICE]   (226427) : Loading success.
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.299 182729 DEBUG nova.network.neutron [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.638 182729 DEBUG nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.639 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.640 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.640 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.640 182729 DEBUG nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.641 182729 WARNING nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state rescued and task_state None.
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.641 182729 DEBUG nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.642 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.642 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.643 182729 DEBUG oslo_concurrency.lockutils [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.643 182729 DEBUG nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.644 182729 WARNING nova.compute.manager [req-04b9e6f3-6769-49c8-affa-5306a7864fa1 req-f4baae23-72c1-4cac-940e-f46d1b2d3234 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state rescued and task_state None.
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.649 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.650 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.667 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start _get_guest_xml network_info=[{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.675 182729 WARNING nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.683 182729 DEBUG nova.virt.libvirt.host [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.684 182729 DEBUG nova.virt.libvirt.host [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.689 182729 DEBUG nova.virt.libvirt.host [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.690 182729 DEBUG nova.virt.libvirt.host [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.693 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.693 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.694 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.694 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.695 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.695 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.696 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.696 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.697 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.697 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.698 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.698 182729 DEBUG nova.virt.hardware [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.699 182729 DEBUG nova.objects.instance [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.722 182729 DEBUG oslo_concurrency.processutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.789 182729 DEBUG oslo_concurrency.processutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.792 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.792 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.794 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.796 182729 DEBUG nova.virt.libvirt.vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.797 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.799 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.804 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <uuid>def087b6-47f2-4914-8d90-1e4426e4da0a</uuid>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <name>instance-00000069</name>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-386387448</nova:name>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:35:13</nova:creationTime>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         <nova:port uuid="afca4e40-9413-4bf5-91f1-2208d0fb0153">
Jan 22 22:35:13 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <system>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="serial">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="uuid">def087b6-47f2-4914-8d90-1e4426e4da0a</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </system>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <os>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </os>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <features>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </features>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/disk.config"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:fa:b4:6a"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <target dev="tapafca4e40-94"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a/console.log" append="off"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <video>
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </video>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:35:13 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:35:13 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:35:13 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:35:13 compute-0 nova_compute[182725]: </domain>
Jan 22 22:35:13 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.806 182729 DEBUG nova.compute.manager [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Preparing to wait for external event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.807 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.807 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.807 182729 DEBUG oslo_concurrency.lockutils [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.809 182729 DEBUG nova.virt.libvirt.vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.809 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.810 182729 DEBUG nova.network.os_vif_util [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.811 182729 DEBUG os_vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.812 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.813 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.814 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.818 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.819 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafca4e40-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.820 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafca4e40-94, col_values=(('external_ids', {'iface-id': 'afca4e40-9413-4bf5-91f1-2208d0fb0153', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:b4:6a', 'vm-uuid': 'def087b6-47f2-4914-8d90-1e4426e4da0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 NetworkManager[54954]: <info>  [1769121313.8244] manager: (tapafca4e40-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.830 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.831 182729 INFO os_vif [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:35:13 compute-0 kernel: tapafca4e40-94: entered promiscuous mode
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.924 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 ovn_controller[94850]: 2026-01-22T22:35:13Z|00399|binding|INFO|Claiming lport afca4e40-9413-4bf5-91f1-2208d0fb0153 for this chassis.
Jan 22 22:35:13 compute-0 ovn_controller[94850]: 2026-01-22T22:35:13Z|00400|binding|INFO|afca4e40-9413-4bf5-91f1-2208d0fb0153: Claiming fa:16:3e:fa:b4:6a 10.100.0.14
Jan 22 22:35:13 compute-0 NetworkManager[54954]: <info>  [1769121313.9273] manager: (tapafca4e40-94): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 ovn_controller[94850]: 2026-01-22T22:35:13Z|00401|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 ovn-installed in OVS
Jan 22 22:35:13 compute-0 ovn_controller[94850]: 2026-01-22T22:35:13Z|00402|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 up in Southbound
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.941 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.943 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.944 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:13 compute-0 nova_compute[182725]: 2026-01-22 22:35:13.954 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.957 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[310ed80a-72da-4f31-a67c-864edd69b395]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.958 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.960 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.960 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[57b8e7da-f328-4db1-a0ab-f5ebc4b429ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.961 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a56a4314-2f25-4323-9585-c09a367fafdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:13 compute-0 systemd-udevd[226456]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:13 compute-0 NetworkManager[54954]: <info>  [1769121313.9761] device (tapafca4e40-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:13 compute-0 NetworkManager[54954]: <info>  [1769121313.9771] device (tapafca4e40-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.978 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[664f467d-9bad-4e0b-8c6d-d2e30d596c61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:13 compute-0 systemd-machined[154006]: New machine qemu-47-instance-00000069.
Jan 22 22:35:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:13.995 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbf982f-38bb-4401-a1cc-f0dcfb102fd4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-00000069.
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.025 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4b87f4-dd86-41a5-9c18-c505df35a24d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 NetworkManager[54954]: <info>  [1769121314.0329] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Jan 22 22:35:14 compute-0 systemd-udevd[226461]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.034 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d75d2556-abb0-4ab0-8fb4-b38705d27714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.045 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.082 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a6cdbf48-b2b8-4c5f-8079-979124d2209d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.086 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9dae931d-1d68-45ba-a245-becb497af0fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 NetworkManager[54954]: <info>  [1769121314.1121] device (tape65877e5-00): carrier: link connected
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.120 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[04fd8910-c5f9-46d4-ad63-d1421418de81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.139 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fd90fbd1-c4a7-4395-8dd6-98230a1542bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495171, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226491, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.161 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f587fec-7ca1-4b3c-8361-20ee301c113b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495171, 'tstamp': 495171}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226492, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.183 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0ded8fb3-861e-4343-be67-53bd51ec26e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495171, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226493, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.236 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c24a6b2d-13aa-42dc-af6d-3802b0e0e3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.316 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0af70392-48d9-4dd8-8792-fcd06edce7f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.318 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.318 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.319 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:14 compute-0 NetworkManager[54954]: <info>  [1769121314.3223] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.325 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.331 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.333 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 ovn_controller[94850]: 2026-01-22T22:35:14Z|00403|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.346 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.347 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3495a224-59f4-4544-8085-bb03935c8670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.348 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:14.349 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.620 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for def087b6-47f2-4914-8d90-1e4426e4da0a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.621 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121314.619736, def087b6-47f2-4914-8d90-1e4426e4da0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:14 compute-0 nova_compute[182725]: 2026-01-22 22:35:14.621 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Started (Lifecycle Event)
Jan 22 22:35:14 compute-0 podman[226532]: 2026-01-22 22:35:14.807214462 +0000 UTC m=+0.059181361 container create bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:14 compute-0 systemd[1]: Started libpod-conmon-bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085.scope.
Jan 22 22:35:14 compute-0 podman[226532]: 2026-01-22 22:35:14.778473619 +0000 UTC m=+0.030440598 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:14 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dfddf5bcc69682a17430b54578971d32057b5ff0280a14fbd1817eb33e5ac3bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:14 compute-0 podman[226532]: 2026-01-22 22:35:14.930695492 +0000 UTC m=+0.182662381 container init bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:35:14 compute-0 podman[226532]: 2026-01-22 22:35:14.936340085 +0000 UTC m=+0.188306974 container start bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:35:14 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [NOTICE]   (226551) : New worker (226553) forked
Jan 22 22:35:14 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [NOTICE]   (226551) : Loading success.
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.359 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.365 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121314.6202128, def087b6-47f2-4914-8d90-1e4426e4da0a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.366 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Paused (Lifecycle Event)
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.386 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.400 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.427 182729 INFO nova.compute.manager [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Unrescuing
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.429 182729 DEBUG oslo_concurrency.lockutils [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.429 182729 DEBUG oslo_concurrency.lockutils [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.430 182729 DEBUG nova.network.neutron [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.431 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.802 182729 DEBUG nova.compute.manager [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.803 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.803 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.804 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.804 182729 DEBUG nova.compute.manager [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Processing event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.804 182729 DEBUG nova.compute.manager [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.805 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.805 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.805 182729 DEBUG oslo_concurrency.lockutils [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.805 182729 DEBUG nova.compute.manager [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] No waiting events found dispatching network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.806 182729 WARNING nova.compute.manager [req-fae8d1ef-20a8-4560-87f8-904d40e4a4bb req-9c3b818a-b315-4954-8b1c-8fb93596b2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received unexpected event network-vif-plugged-afca4e40-9413-4bf5-91f1-2208d0fb0153 for instance with vm_state resized and task_state resize_reverting.
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.807 182729 DEBUG nova.compute.manager [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.816 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121315.8115745, def087b6-47f2-4914-8d90-1e4426e4da0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.817 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Resumed (Lifecycle Event)
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.822 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance running successfully.
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.823 182729 DEBUG nova.virt.libvirt.driver [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.838 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.842 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.890 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 22 22:35:15 compute-0 nova_compute[182725]: 2026-01-22 22:35:15.942 182729 INFO nova.compute.manager [None req-a5313b5c-d388-4426-908a-352ab5a5aad7 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance to original state: 'active'
Jan 22 22:35:16 compute-0 nova_compute[182725]: 2026-01-22 22:35:16.804 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:16 compute-0 nova_compute[182725]: 2026-01-22 22:35:16.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:18 compute-0 nova_compute[182725]: 2026-01-22 22:35:18.822 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:19 compute-0 nova_compute[182725]: 2026-01-22 22:35:19.047 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:19 compute-0 podman[226563]: 2026-01-22 22:35:19.130299731 +0000 UTC m=+0.063886110 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:35:20 compute-0 nova_compute[182725]: 2026-01-22 22:35:20.664 182729 DEBUG nova.network.neutron [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updating instance_info_cache with network_info: [{"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:21 compute-0 nova_compute[182725]: 2026-01-22 22:35:21.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:21 compute-0 nova_compute[182725]: 2026-01-22 22:35:21.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:35:21 compute-0 nova_compute[182725]: 2026-01-22 22:35:21.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.002 182729 DEBUG oslo_concurrency.lockutils [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-9dc942b5-8b65-4eb7-a57c-30d0a6221426" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.003 182729 DEBUG nova.objects.instance [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'flavor' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:22 compute-0 kernel: tap03c52ba6-92 (unregistering): left promiscuous mode
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.0597] device (tap03c52ba6-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00404|binding|INFO|Releasing lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 from this chassis (sb_readonly=0)
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00405|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 down in Southbound
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00406|binding|INFO|Removing iface tap03c52ba6-92 ovn-installed in OVS
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.098 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '6', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.100 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.102 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.103 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae6ee20-7804-48d9-9cbd-c35de92af5a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.104 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore
Jan 22 22:35:22 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 22 22:35:22 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006c.scope: Consumed 10.355s CPU time.
Jan 22 22:35:22 compute-0 systemd-machined[154006]: Machine qemu-46-instance-0000006c terminated.
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [NOTICE]   (226427) : haproxy version is 2.8.14-c23fe91
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [NOTICE]   (226427) : path to executable is /usr/sbin/haproxy
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [WARNING]  (226427) : Exiting Master process...
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [WARNING]  (226427) : Exiting Master process...
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [ALERT]    (226427) : Current worker (226429) exited with code 143 (Terminated)
Jan 22 22:35:22 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226423]: [WARNING]  (226427) : All workers exited. Exiting... (0)
Jan 22 22:35:22 compute-0 systemd[1]: libpod-b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907.scope: Deactivated successfully.
Jan 22 22:35:22 compute-0 conmon[226423]: conmon b0013f12988e4525b0a5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907.scope/container/memory.events
Jan 22 22:35:22 compute-0 podman[226608]: 2026-01-22 22:35:22.256748766 +0000 UTC m=+0.055088848 container died b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:35:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907-userdata-shm.mount: Deactivated successfully.
Jan 22 22:35:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-a45a551051657db443545bd49d039040e37dbfa0d211d91ffb465be6af1125d3-merged.mount: Deactivated successfully.
Jan 22 22:35:22 compute-0 podman[226608]: 2026-01-22 22:35:22.314176382 +0000 UTC m=+0.112516434 container cleanup b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:22 compute-0 systemd[1]: libpod-conmon-b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907.scope: Deactivated successfully.
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.322 182729 INFO nova.virt.libvirt.driver [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance destroyed successfully.
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.323 182729 DEBUG nova.objects.instance [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:22 compute-0 podman[226659]: 2026-01-22 22:35:22.398315111 +0000 UTC m=+0.050488002 container remove b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.409 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4392ec-a8cd-48c5-8368-6e80b8965d73]: (4, ('Thu Jan 22 10:35:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907)\nb0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907\nThu Jan 22 10:35:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (b0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907)\nb0013f12988e4525b0a54f3ea986fbf220fe718ab7acec20df805927df7a3907\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.412 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0e44149c-1082-43e7-a38e-f5c7d5ecc462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.413 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.415 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 systemd-udevd[226585]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.4184] manager: (tap03c52ba6-92): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 22 22:35:22 compute-0 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 22:35:22 compute-0 kernel: tap03c52ba6-92: entered promiscuous mode
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.4329] device (tap03c52ba6-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.4340] device (tap03c52ba6-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00407|binding|INFO|Claiming lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 for this chassis.
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00408|binding|INFO|03c52ba6-920a-4801-bb3e-c99e3203ab13: Claiming fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.441 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[639c99ae-6dee-4ebc-862b-b4cb257a6bf6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.445 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '6', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00409|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 ovn-installed in OVS
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00410|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 up in Southbound
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.460 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbab643-8daa-4440-a43d-90ac373ad2dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.462 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdcfa8f-665e-43f0-a04b-212cde837c0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 systemd-machined[154006]: New machine qemu-48-instance-0000006c.
Jan 22 22:35:22 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-0000006c.
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.476 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[16a7aa14-fb52-4a3d-9b12-5cae777bada6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494946, 'reachable_time': 43788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226689, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.485 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.485 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4572bbb4-f986-49f1-9bd8-71b0ed8eaad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.486 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.488 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.501 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[68bf4c23-2ad6-4b64-aa85-190f0716ef49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.503 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.505 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.505 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[36c6fcd2-3c9f-4d94-ab58-844bee7b73fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.506 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[699c3284-0cf0-42e4-95a6-8995c8e3bd8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.519 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9ed1b5-40c5-4a76-bc91-6c22265ff460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.550 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d3ab75-441f-4a97-992b-e9a1c3f9e9a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.578 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8d653a-3d1a-489b-8176-cbf2f0c8ee28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.585 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe54997a-37fd-490b-8b30-3061e922064c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.5875] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.621 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7a699347-32c0-48e1-a870-a81e9b64b4de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.626 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[de1f967c-d727-45ec-a810-3884c4147ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.6500] device (tapad2345e3-00): carrier: link connected
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.655 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3b92164f-bedd-4006-8ec6-7735bf79c54c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.676 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0439f7-e6fc-4078-8122-fbdf4b85c1c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496025, 'reachable_time': 37254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226727, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.692 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[213a117d-3a57-46d9-8c4b-3d28910eb75c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496025, 'tstamp': 496025}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226728, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.695 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.695 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.696 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.696 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.711 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7d73ddb0-8e10-470c-878b-eeda556e9abb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496025, 'reachable_time': 37254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226729, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.739 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[44a4936c-28ca-4ede-b853-ee37ac5d47bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.751 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 9dc942b5-8b65-4eb7-a57c-30d0a6221426 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.751 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121322.7508173, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.752 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Resumed (Lifecycle Event)
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.754 182729 DEBUG nova.compute.manager [None req-e5107600-6faa-48cf-93d7-13e58334ee99 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.801 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c4867962-fbee-4042-9157-83c6cb6f194e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.803 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.803 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.803 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.804 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.805 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 NetworkManager[54954]: <info>  [1769121322.8064] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 22 22:35:22 compute-0 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.809 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.809 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:22 compute-0 ovn_controller[94850]: 2026-01-22T22:35:22Z|00411|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.811 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.821 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.821 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.822 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[001ac2f4-bf2b-4888-83a0-bc37246b9db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.823 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:22.823 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.852 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121322.7512424, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:22 compute-0 nova_compute[182725]: 2026-01-22 22:35:22.853 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Started (Lifecycle Event)
Jan 22 22:35:23 compute-0 podman[226763]: 2026-01-22 22:35:23.228987147 +0000 UTC m=+0.067720066 container create f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:23 compute-0 systemd[1]: Started libpod-conmon-f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba.scope.
Jan 22 22:35:23 compute-0 podman[226763]: 2026-01-22 22:35:23.187950454 +0000 UTC m=+0.026683383 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:23 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e8b88bef2adf302a173b6957c33fbebb042392c4213094c48f230deb845ffbf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:23 compute-0 podman[226763]: 2026-01-22 22:35:23.341551901 +0000 UTC m=+0.180284840 container init f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:35:23 compute-0 podman[226763]: 2026-01-22 22:35:23.347652975 +0000 UTC m=+0.186385884 container start f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:35:23 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [NOTICE]   (226783) : New worker (226785) forked
Jan 22 22:35:23 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [NOTICE]   (226783) : Loading success.
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.451 182729 DEBUG nova.compute.manager [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.452 182729 DEBUG oslo_concurrency.lockutils [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.452 182729 DEBUG oslo_concurrency.lockutils [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.453 182729 DEBUG oslo_concurrency.lockutils [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.453 182729 DEBUG nova.compute.manager [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.453 182729 WARNING nova.compute.manager [req-e60c5439-1205-4164-acf8-4160ea650b4e req-2273bf21-6362-4ff7-8534-eecfeb04818d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state None.
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.483 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.489 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:23 compute-0 nova_compute[182725]: 2026-01-22 22:35:23.824 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:24 compute-0 nova_compute[182725]: 2026-01-22 22:35:24.051 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 podman[226795]: 2026-01-22 22:35:25.187941395 +0000 UTC m=+0.107023946 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 22:35:25 compute-0 podman[226794]: 2026-01-22 22:35:25.226719441 +0000 UTC m=+0.141570736 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, tcib_managed=true)
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.258 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.259 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.259 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.260 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.260 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.275 182729 INFO nova.compute.manager [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Terminating instance
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.289 182729 DEBUG nova.compute.manager [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:35:25 compute-0 kernel: tapafca4e40-94 (unregistering): left promiscuous mode
Jan 22 22:35:25 compute-0 NetworkManager[54954]: <info>  [1769121325.3114] device (tapafca4e40-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:35:25 compute-0 ovn_controller[94850]: 2026-01-22T22:35:25Z|00412|binding|INFO|Releasing lport afca4e40-9413-4bf5-91f1-2208d0fb0153 from this chassis (sb_readonly=0)
Jan 22 22:35:25 compute-0 ovn_controller[94850]: 2026-01-22T22:35:25Z|00413|binding|INFO|Setting lport afca4e40-9413-4bf5-91f1-2208d0fb0153 down in Southbound
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 ovn_controller[94850]: 2026-01-22T22:35:25Z|00414|binding|INFO|Removing iface tapafca4e40-94 ovn-installed in OVS
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.327 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 22 22:35:25 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000069.scope: Consumed 10.495s CPU time.
Jan 22 22:35:25 compute-0 systemd-machined[154006]: Machine qemu-47-instance-00000069 terminated.
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.552 182729 INFO nova.virt.libvirt.driver [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Instance destroyed successfully.
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.553 182729 DEBUG nova.objects.instance [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid def087b6-47f2-4914-8d90-1e4426e4da0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.693 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:b4:6a 10.100.0.14'], port_security=['fa:16:3e:fa:b4:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'def087b6-47f2-4914-8d90-1e4426e4da0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=afca4e40-9413-4bf5-91f1-2208d0fb0153) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.694 104215 INFO neutron.agent.ovn.metadata.agent [-] Port afca4e40-9413-4bf5-91f1-2208d0fb0153 in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.695 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.697 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ed13078a-688c-4318-b74a-cbb49e95cf9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.698 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:35:25 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [NOTICE]   (226551) : haproxy version is 2.8.14-c23fe91
Jan 22 22:35:25 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [NOTICE]   (226551) : path to executable is /usr/sbin/haproxy
Jan 22 22:35:25 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [WARNING]  (226551) : Exiting Master process...
Jan 22 22:35:25 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [ALERT]    (226551) : Current worker (226553) exited with code 143 (Terminated)
Jan 22 22:35:25 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[226547]: [WARNING]  (226551) : All workers exited. Exiting... (0)
Jan 22 22:35:25 compute-0 systemd[1]: libpod-bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085.scope: Deactivated successfully.
Jan 22 22:35:25 compute-0 podman[226877]: 2026-01-22 22:35:25.833121871 +0000 UTC m=+0.046889472 container died bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:35:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085-userdata-shm.mount: Deactivated successfully.
Jan 22 22:35:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-dfddf5bcc69682a17430b54578971d32057b5ff0280a14fbd1817eb33e5ac3bc-merged.mount: Deactivated successfully.
Jan 22 22:35:25 compute-0 podman[226877]: 2026-01-22 22:35:25.873129638 +0000 UTC m=+0.086897199 container cleanup bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:35:25 compute-0 systemd[1]: libpod-conmon-bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085.scope: Deactivated successfully.
Jan 22 22:35:25 compute-0 podman[226906]: 2026-01-22 22:35:25.930823121 +0000 UTC m=+0.038946982 container remove bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.936 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5494e5-5da1-4e84-bc75-6ab84273fb57]: (4, ('Thu Jan 22 10:35:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085)\nbd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085\nThu Jan 22 10:35:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (bd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085)\nbd4dae235abe7ce1410e7986e0c8fbc9fa574a501d1cd901709bd545053af085\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.938 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[55e0562d-833f-47d9-a794-b79fa953fdd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.940 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:35:25 compute-0 nova_compute[182725]: 2026-01-22 22:35:25.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.966 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f39062cd-82b3-4a2d-809d-d0ffe4769add]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.980 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a09643cc-8947-44e3-81a5-c581d35c3d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:25.982 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fb463176-e149-489a-bf46-6c1c79aadd1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:26.000 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3314e906-cf6f-4466-a972-dc28bbe76fad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495162, 'reachable_time': 30608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226924, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:26 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:35:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:26.005 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:35:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:26.005 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[26ba9e55-00eb-4674-9d12-f3e295df0487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.338 182729 DEBUG nova.compute.manager [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.338 182729 DEBUG oslo_concurrency.lockutils [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.339 182729 DEBUG oslo_concurrency.lockutils [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.339 182729 DEBUG oslo_concurrency.lockutils [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.339 182729 DEBUG nova.compute.manager [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.339 182729 WARNING nova.compute.manager [req-cf028ea8-3456-49b2-9d9d-b68459437661 req-dbb5b5a0-6018-4058-9c4c-c80cb8a38dc0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state None.
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.367 182729 DEBUG nova.virt.libvirt.vif [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-386387448',display_name='tempest-ServerActionsTestJSON-server-386387448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-386387448',id=105,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-8g9ztr6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=def087b6-47f2-4914-8d90-1e4426e4da0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.368 182729 DEBUG nova.network.os_vif_util [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.369 182729 DEBUG nova.network.os_vif_util [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.369 182729 DEBUG os_vif [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.370 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.371 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafca4e40-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.372 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.374 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.377 182729 INFO os_vif [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:b4:6a,bridge_name='br-int',has_traffic_filtering=True,id=afca4e40-9413-4bf5-91f1-2208d0fb0153,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafca4e40-94')
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.378 182729 INFO nova.virt.libvirt.driver [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Deleting instance files /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_del
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.379 182729 INFO nova.virt.libvirt.driver [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Deletion of /var/lib/nova/instances/def087b6-47f2-4914-8d90-1e4426e4da0a_del complete
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.467 182729 INFO nova.compute.manager [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Took 1.18 seconds to destroy the instance on the hypervisor.
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.467 182729 DEBUG oslo.service.loopingcall [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.468 182729 DEBUG nova.compute.manager [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:35:26 compute-0 nova_compute[182725]: 2026-01-22 22:35:26.468 182729 DEBUG nova.network.neutron [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:35:27 compute-0 nova_compute[182725]: 2026-01-22 22:35:27.786 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [{"id": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "address": "fa:16:3e:fa:b4:6a", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafca4e40-94", "ovs_interfaceid": "afca4e40-9413-4bf5-91f1-2208d0fb0153", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.451 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-def087b6-47f2-4914-8d90-1e4426e4da0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.452 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.453 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.453 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.454 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.454 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.454 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.455 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.482 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.482 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.483 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.483 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.564 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.589 182729 DEBUG nova.compute.manager [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.590 182729 DEBUG oslo_concurrency.lockutils [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.590 182729 DEBUG oslo_concurrency.lockutils [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.590 182729 DEBUG oslo_concurrency.lockutils [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.590 182729 DEBUG nova.compute.manager [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.591 182729 WARNING nova.compute.manager [req-6d19d9af-d66b-4fd3-adad-d6249baa8cdf req-cfddc881-9c58-4544-a9ae-787eb152c766 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state None.
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.635 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.636 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.695 182729 DEBUG nova.network.neutron [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.714 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.735 182729 INFO nova.compute.manager [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Took 2.27 seconds to deallocate network for instance.
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.878 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.879 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5469MB free_disk=73.30503845214844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.879 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:28 compute-0 nova_compute[182725]: 2026-01-22 22:35:28.880 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:29 compute-0 nova_compute[182725]: 2026-01-22 22:35:29.053 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:30 compute-0 nova_compute[182725]: 2026-01-22 22:35:30.855 182729 DEBUG nova.compute.manager [req-0fb85c0e-a70d-4dd4-8f7d-3ca55fac5d43 req-d8a5f354-1ef0-4cf6-a852-9a380f227717 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Received event network-vif-deleted-afca4e40-9413-4bf5-91f1-2208d0fb0153 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.021 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.135 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.176 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 9dc942b5-8b65-4eb7-a57c-30d0a6221426 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.177 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance def087b6-47f2-4914-8d90-1e4426e4da0a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.178 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.178 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.189 182729 DEBUG nova.compute.manager [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.190 182729 DEBUG oslo_concurrency.lockutils [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.190 182729 DEBUG oslo_concurrency.lockutils [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.190 182729 DEBUG oslo_concurrency.lockutils [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.190 182729 DEBUG nova.compute.manager [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.191 182729 WARNING nova.compute.manager [req-e98866f5-a076-450a-b65f-c3c4cd095338 req-90fca6b1-e3e8-4adf-8062-fe610bdb0bf1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state active and task_state None.
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.263 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.287 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.321 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.321 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.322 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.372 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.385 182729 DEBUG nova.compute.provider_tree [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.401 182729 DEBUG nova.scheduler.client.report [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.422 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.465 182729 INFO nova.scheduler.client.report [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Deleted allocations for instance def087b6-47f2-4914-8d90-1e4426e4da0a
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.573 182729 DEBUG oslo_concurrency.lockutils [None req-120e158d-02e5-427f-bd27-ae32e6828a18 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "def087b6-47f2-4914-8d90-1e4426e4da0a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.757 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:31 compute-0 nova_compute[182725]: 2026-01-22 22:35:31.758 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:34 compute-0 nova_compute[182725]: 2026-01-22 22:35:34.056 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:34 compute-0 ovn_controller[94850]: 2026-01-22T22:35:34Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:7f:79 10.100.0.5
Jan 22 22:35:34 compute-0 nova_compute[182725]: 2026-01-22 22:35:34.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:35:35 compute-0 podman[226942]: 2026-01-22 22:35:35.139461959 +0000 UTC m=+0.059142900 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:35:35 compute-0 podman[226941]: 2026-01-22 22:35:35.139011218 +0000 UTC m=+0.061331626 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:36 compute-0 podman[226982]: 2026-01-22 22:35:36.179855677 +0000 UTC m=+0.095687480 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:35:36 compute-0 nova_compute[182725]: 2026-01-22 22:35:36.379 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.246 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.247 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.266 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.373 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.374 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.383 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.384 182729 INFO nova.compute.claims [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.394 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.395 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.431 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.543 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.603 182729 DEBUG nova.compute.provider_tree [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.622 182729 DEBUG nova.scheduler.client.report [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.648 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.649 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.652 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.660 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.661 182729 INFO nova.compute.claims [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.768 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.769 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.820 182729 INFO nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:35:37 compute-0 nova_compute[182725]: 2026-01-22 22:35:37.846 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.009 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.011 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.011 182729 INFO nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Creating image(s)
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.012 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.013 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.014 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.031 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.062 182729 DEBUG nova.compute.provider_tree [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.091 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.092 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.093 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.109 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.138 182729 DEBUG nova.policy [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97ae504d8c4f43529c360266766791d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.143 182729 DEBUG nova.scheduler.client.report [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.174 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.175 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.196 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.196 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.234 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.235 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.235 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.263 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.264 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.287 182729 INFO nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.295 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.296 182729 DEBUG nova.virt.disk.api [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.296 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.314 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.350 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.351 182729 DEBUG nova.virt.disk.api [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.351 182729 DEBUG nova.objects.instance [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.376 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.377 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Ensure instance console log exists: /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.378 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.378 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.378 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.449 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.451 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.452 182729 INFO nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Creating image(s)
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.453 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.454 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.455 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.482 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.541 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.543 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.544 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.558 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.619 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.620 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.659 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.660 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.661 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.720 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.721 182729 DEBUG nova.virt.disk.api [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Checking if we can resize image /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.722 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.789 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.791 182729 DEBUG nova.virt.disk.api [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Cannot resize image /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.791 182729 DEBUG nova.objects.instance [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'migration_context' on Instance uuid 49096ce9-494c-4c84-b263-86a05230d8af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.819 182729 DEBUG nova.policy [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.823 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.823 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Ensure instance console log exists: /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.824 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.824 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:38 compute-0 nova_compute[182725]: 2026-01-22 22:35:38.825 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:39 compute-0 nova_compute[182725]: 2026-01-22 22:35:39.060 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:39 compute-0 nova_compute[182725]: 2026-01-22 22:35:39.439 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Successfully created port: eecbb79f-fdf2-48a6-828b-d9fc528ec81f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.453 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Successfully created port: 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.551 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121325.5481315, def087b6-47f2-4914-8d90-1e4426e4da0a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.552 182729 INFO nova.compute.manager [-] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] VM Stopped (Lifecycle Event)
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.578 182729 DEBUG nova.compute.manager [None req-667f4e6c-c087-4248-92c4-ee7f5f7dcc60 - - - - - -] [instance: def087b6-47f2-4914-8d90-1e4426e4da0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.777 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Successfully updated port: eecbb79f-fdf2-48a6-828b-d9fc528ec81f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.859 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.859 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:40 compute-0 nova_compute[182725]: 2026-01-22 22:35:40.860 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:41 compute-0 nova_compute[182725]: 2026-01-22 22:35:41.106 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:35:41 compute-0 nova_compute[182725]: 2026-01-22 22:35:41.384 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.209 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Successfully updated port: 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.249 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.250 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquired lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.250 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.521 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.681 182729 DEBUG nova.compute.manager [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-changed-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.682 182729 DEBUG nova.compute.manager [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Refreshing instance network info cache due to event network-changed-eecbb79f-fdf2-48a6-828b-d9fc528ec81f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.682 182729 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.981 182729 DEBUG nova.compute.manager [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-changed-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.982 182729 DEBUG nova.compute.manager [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Refreshing instance network info cache due to event network-changed-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:35:42 compute-0 nova_compute[182725]: 2026-01-22 22:35:42.982 182729 DEBUG oslo_concurrency.lockutils [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.019 182729 DEBUG nova.network.neutron [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.041 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.042 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance network_info: |[{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.042 182729 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.042 182729 DEBUG nova.network.neutron [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Refreshing network info cache for port eecbb79f-fdf2-48a6-828b-d9fc528ec81f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.046 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start _get_guest_xml network_info=[{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.052 182729 WARNING nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.057 182729 DEBUG nova.virt.libvirt.host [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.058 182729 DEBUG nova.virt.libvirt.host [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.065 182729 DEBUG nova.virt.libvirt.host [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.066 182729 DEBUG nova.virt.libvirt.host [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.068 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.068 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.069 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.069 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.069 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.069 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.070 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.070 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.070 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.071 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.071 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.071 182729 DEBUG nova.virt.hardware [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.076 182729 DEBUG nova.virt.libvirt.vif [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.077 182729 DEBUG nova.network.os_vif_util [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.077 182729 DEBUG nova.network.os_vif_util [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.079 182729 DEBUG nova.objects.instance [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.096 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <uuid>b924048a-36af-45c3-80fd-9400d5975e6a</uuid>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <name>instance-0000006f</name>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-826657528</nova:name>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:35:43</nova:creationTime>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         <nova:port uuid="eecbb79f-fdf2-48a6-828b-d9fc528ec81f">
Jan 22 22:35:43 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <system>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="serial">b924048a-36af-45c3-80fd-9400d5975e6a</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="uuid">b924048a-36af-45c3-80fd-9400d5975e6a</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </system>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <os>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </os>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <features>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </features>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:1f:1f:e7"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <target dev="tapeecbb79f-fd"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/console.log" append="off"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <video>
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </video>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:35:43 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:35:43 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:35:43 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:35:43 compute-0 nova_compute[182725]: </domain>
Jan 22 22:35:43 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.098 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Preparing to wait for external event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.098 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.098 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.099 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.099 182729 DEBUG nova.virt.libvirt.vif [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.100 182729 DEBUG nova.network.os_vif_util [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.100 182729 DEBUG nova.network.os_vif_util [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.101 182729 DEBUG os_vif [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.101 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.102 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.102 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.105 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeecbb79f-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.105 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeecbb79f-fd, col_values=(('external_ids', {'iface-id': 'eecbb79f-fdf2-48a6-828b-d9fc528ec81f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:1f:e7', 'vm-uuid': 'b924048a-36af-45c3-80fd-9400d5975e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.108 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 NetworkManager[54954]: <info>  [1769121343.1091] manager: (tapeecbb79f-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.110 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.116 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.117 182729 INFO os_vif [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd')
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.183 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.184 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.184 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] No VIF found with MAC fa:16:3e:1f:1f:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.185 182729 INFO nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Using config drive
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.704 182729 INFO nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Creating config drive at /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.714 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpceae8nd3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.847 182729 DEBUG oslo_concurrency.processutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpceae8nd3" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:43 compute-0 kernel: tapeecbb79f-fd: entered promiscuous mode
Jan 22 22:35:43 compute-0 NetworkManager[54954]: <info>  [1769121343.9229] manager: (tapeecbb79f-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Jan 22 22:35:43 compute-0 ovn_controller[94850]: 2026-01-22T22:35:43Z|00415|binding|INFO|Claiming lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f for this chassis.
Jan 22 22:35:43 compute-0 ovn_controller[94850]: 2026-01-22T22:35:43Z|00416|binding|INFO|eecbb79f-fdf2-48a6-828b-d9fc528ec81f: Claiming fa:16:3e:1f:1f:e7 10.100.0.4
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.926 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 ovn_controller[94850]: 2026-01-22T22:35:43Z|00417|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f ovn-installed in OVS
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 nova_compute[182725]: 2026-01-22 22:35:43.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:43 compute-0 ovn_controller[94850]: 2026-01-22T22:35:43Z|00418|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f up in Southbound
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.942 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.944 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.946 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:43 compute-0 systemd-udevd[227053]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.959 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6fae9503-9fcc-4e55-97a5-213981821aef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.960 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.962 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.962 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5e86bfac-fe05-4c27-8e22-3eeeb012b62a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.963 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6c56d1-c1d5-42fa-b495-2ad66151ea08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:43 compute-0 NetworkManager[54954]: <info>  [1769121343.9698] device (tapeecbb79f-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:43 compute-0 NetworkManager[54954]: <info>  [1769121343.9708] device (tapeecbb79f-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.974 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e2dc141d-137f-4f28-a0f9-2cbcce01229d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:43 compute-0 systemd-machined[154006]: New machine qemu-49-instance-0000006f.
Jan 22 22:35:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:43.997 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8daa05-a921-48ea-abbf-66d936d70a8b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:43 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000006f.
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.021 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0f098d-3a1c-47a8-bf37-adeeef928314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 NetworkManager[54954]: <info>  [1769121344.0281] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/200)
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.027 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7da8352-b225-4956-9de6-4893e89bfc5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.059 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2b34c0-c2bf-4acb-a256-e0b0dc10f7b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.065 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[64ac6c15-2a51-43fb-96e5-947ca8459067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 NetworkManager[54954]: <info>  [1769121344.0920] device (tape65877e5-00): carrier: link connected
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.098 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[31f97f92-c9b3-4dd2-aa4f-d30b551d84ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.118 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[51fe09ae-de74-435c-89b6-e1f79bfd8b68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498169, 'reachable_time': 32305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227088, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.135 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b796d178-d310-41e6-bb79-1a5d5d0b8a09]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498169, 'tstamp': 498169}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227090, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.154 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[35b1b2b4-57f8-4a5a-8452-63acd796ff1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498169, 'reachable_time': 32305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227095, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.189 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[573db2c1-8f6c-4132-af26-0e6476488048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.225 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121344.2250004, b924048a-36af-45c3-80fd-9400d5975e6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.226 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Started (Lifecycle Event)
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.242 182729 DEBUG nova.network.neutron [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updating instance_info_cache with network_info: [{"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.255 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.261 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121344.2251453, b924048a-36af-45c3-80fd-9400d5975e6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.261 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Paused (Lifecycle Event)
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.278 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5591e3aa-ccc9-405b-b635-cad5f7b7bc5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.280 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.280 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.281 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.283 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 NetworkManager[54954]: <info>  [1769121344.2838] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Jan 22 22:35:44 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.286 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.287 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 ovn_controller[94850]: 2026-01-22T22:35:44Z|00419|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.297 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.299 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.300 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[688376d7-9b63-4c5b-9173-d7f9f42064cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.301 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:44 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:44.303 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.454 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Releasing lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.455 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Instance network_info: |[{"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.456 182729 DEBUG oslo_concurrency.lockutils [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.456 182729 DEBUG nova.network.neutron [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Refreshing network info cache for port 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.462 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Start _get_guest_xml network_info=[{"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.470 182729 WARNING nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.477 182729 DEBUG nova.virt.libvirt.host [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.479 182729 DEBUG nova.virt.libvirt.host [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.487 182729 DEBUG nova.virt.libvirt.host [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.489 182729 DEBUG nova.virt.libvirt.host [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.491 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.491 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.492 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.493 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.493 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.493 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.494 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.494 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.494 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.495 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.495 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.495 182729 DEBUG nova.virt.hardware [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.503 182729 DEBUG nova.virt.libvirt.vif [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1722590006',display_name='tempest-₡-1722590006',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1722590006',id=112,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-5ci62lkw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:38Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=49096ce9-494c-4c84-b263-86a05230d8af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.503 182729 DEBUG nova.network.os_vif_util [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.504 182729 DEBUG nova.network.os_vif_util [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.506 182729 DEBUG nova.objects.instance [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49096ce9-494c-4c84-b263-86a05230d8af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.515 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.520 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.562 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <uuid>49096ce9-494c-4c84-b263-86a05230d8af</uuid>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <name>instance-00000070</name>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:name>tempest-₡-1722590006</nova:name>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:35:44</nova:creationTime>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:user uuid="10767689cb2d4ee383920e3d388a6dfe">tempest-ServersTestJSON-1676167595-project-member</nova:user>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:project uuid="25a5678696f747b3ac42324626646e40">tempest-ServersTestJSON-1676167595</nova:project>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         <nova:port uuid="9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4">
Jan 22 22:35:44 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <system>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="serial">49096ce9-494c-4c84-b263-86a05230d8af</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="uuid">49096ce9-494c-4c84-b263-86a05230d8af</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </system>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <os>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </os>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <features>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </features>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.config"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:aa:1b:c8"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <target dev="tap9b7c9dcb-22"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/console.log" append="off"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <video>
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </video>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:35:44 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:35:44 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:35:44 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:35:44 compute-0 nova_compute[182725]: </domain>
Jan 22 22:35:44 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.562 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Preparing to wait for external event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.563 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.563 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.563 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.564 182729 DEBUG nova.virt.libvirt.vif [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1722590006',display_name='tempest-₡-1722590006',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1722590006',id=112,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-5ci62lkw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:38Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=49096ce9-494c-4c84-b263-86a05230d8af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.564 182729 DEBUG nova.network.os_vif_util [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.565 182729 DEBUG nova.network.os_vif_util [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.566 182729 DEBUG os_vif [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.566 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.567 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.567 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.571 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b7c9dcb-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.571 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b7c9dcb-22, col_values=(('external_ids', {'iface-id': '9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:1b:c8', 'vm-uuid': '49096ce9-494c-4c84-b263-86a05230d8af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 NetworkManager[54954]: <info>  [1769121344.5746] manager: (tap9b7c9dcb-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.577 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.579 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.580 182729 INFO os_vif [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22')
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.668 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.669 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.669 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No VIF found with MAC fa:16:3e:aa:1b:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:35:44 compute-0 nova_compute[182725]: 2026-01-22 22:35:44.670 182729 INFO nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Using config drive
Jan 22 22:35:44 compute-0 podman[227131]: 2026-01-22 22:35:44.760393989 +0000 UTC m=+0.076737123 container create bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:35:44 compute-0 systemd[1]: Started libpod-conmon-bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004.scope.
Jan 22 22:35:44 compute-0 podman[227131]: 2026-01-22 22:35:44.72865399 +0000 UTC m=+0.044997144 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67e77da44cd34c45e600822f0c6abab867a47bc8facf833a4a5f4421ca048792/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:44 compute-0 podman[227131]: 2026-01-22 22:35:44.872612815 +0000 UTC m=+0.188955989 container init bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:35:44 compute-0 podman[227131]: 2026-01-22 22:35:44.880992516 +0000 UTC m=+0.197335680 container start bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:35:44 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [NOTICE]   (227151) : New worker (227153) forked
Jan 22 22:35:44 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [NOTICE]   (227151) : Loading success.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.096 182729 DEBUG nova.compute.manager [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.097 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.097 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.098 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.098 182729 DEBUG nova.compute.manager [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Processing event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.098 182729 DEBUG nova.compute.manager [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.099 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.099 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.100 182729 DEBUG oslo_concurrency.lockutils [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.100 182729 DEBUG nova.compute.manager [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.101 182729 WARNING nova.compute.manager [req-2b83b86d-68b9-4de1-afa1-38557314a5cc req-ffdd4d44-ee4f-4f0f-8dfd-824c305c2454 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state building and task_state spawning.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.102 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.107 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121345.1074436, b924048a-36af-45c3-80fd-9400d5975e6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.108 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Resumed (Lifecycle Event)
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.112 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.119 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance spawned successfully.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.120 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.136 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.146 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.152 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.153 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.154 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.155 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.155 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.156 182729 DEBUG nova.virt.libvirt.driver [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.186 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.217 182729 INFO nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Creating config drive at /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.config
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.226 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfi9ey6mh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.274 182729 INFO nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Took 7.26 seconds to spawn the instance on the hypervisor.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.276 182729 DEBUG nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.386 182729 DEBUG oslo_concurrency.processutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfi9ey6mh" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:35:45 compute-0 kernel: tap9b7c9dcb-22: entered promiscuous mode
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.4720] manager: (tap9b7c9dcb-22): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Jan 22 22:35:45 compute-0 systemd-udevd[227080]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.475 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_controller[94850]: 2026-01-22T22:35:45Z|00420|binding|INFO|Claiming lport 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 for this chassis.
Jan 22 22:35:45 compute-0 ovn_controller[94850]: 2026-01-22T22:35:45Z|00421|binding|INFO|9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4: Claiming fa:16:3e:aa:1b:c8 10.100.0.14
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.477 182729 INFO nova.compute.manager [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Took 8.15 seconds to build instance.
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.488 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:1b:c8 10.100.0.14'], port_security=['fa:16:3e:aa:1b:c8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.490 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 bound to our chassis
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.493 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:35:45 compute-0 ovn_controller[94850]: 2026-01-22T22:35:45Z|00422|binding|INFO|Setting lport 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 ovn-installed in OVS
Jan 22 22:35:45 compute-0 ovn_controller[94850]: 2026-01-22T22:35:45Z|00423|binding|INFO|Setting lport 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 up in Southbound
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.5020] device (tap9b7c9dcb-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.5028] device (tap9b7c9dcb-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.511 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.515 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e43d0bde-8b02-42c5-a0ce-5f4dbf5d4d19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.516 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8cfbdc2a-d1 in ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.519 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8cfbdc2a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.519 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae9aa79-a9d1-4e36-96ba-c48cba26d562]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.520 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[709407c4-22f0-4494-8e67-06e92dad5ffd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.532 182729 DEBUG nova.network.neutron [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updated VIF entry in instance network info cache for port eecbb79f-fdf2-48a6-828b-d9fc528ec81f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.533 182729 DEBUG nova.network.neutron [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:45 compute-0 systemd-machined[154006]: New machine qemu-50-instance-00000070.
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.535 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a7203fc3-1372-49fe-83ec-8558ffa51eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-00000070.
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.567 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3c59d533-8c7a-4615-9471-90c7c13daba9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.583 182729 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.585 182729 DEBUG oslo_concurrency.lockutils [None req-0e0cc210-1a2b-4b03-a9ea-e11c90617943 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.606 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6e5483-c9ca-4d6b-8788-6218fdd6e0d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.6178] manager: (tap8cfbdc2a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/204)
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.619 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0c3805-3729-4c84-a22a-3d8073302495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.664 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[23ee2cf9-da5f-4a10-9713-2cd49ed5d71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.672 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e232ed39-93f5-441d-9cb0-cde77417b393]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.7019] device (tap8cfbdc2a-d0): carrier: link connected
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.711 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fb11c7-ddd6-4b95-afdb-50aff1197bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.740 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc716be-6ba5-40aa-b091-4a13837bec23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227198, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.764 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b5587a0b-a3c1-45ad-8660-c9c0b3f9f836]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:b87b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498330, 'tstamp': 498330}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227199, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.782 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4cae98-d427-4ccd-8fe7-f3b93a7bd274]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227200, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.826 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9f358f9a-1612-40f3-b73e-8887778cfcd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.898 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7b666635-1e04-48f2-9dce-163e34e56ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.900 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.900 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.901 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.902 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 kernel: tap8cfbdc2a-d0: entered promiscuous mode
Jan 22 22:35:45 compute-0 NetworkManager[54954]: <info>  [1769121345.9041] manager: (tap8cfbdc2a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.905 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.908 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.909 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_controller[94850]: 2026-01-22T22:35:45Z|00424|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.910 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.912 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:35:45 compute-0 nova_compute[182725]: 2026-01-22 22:35:45.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.920 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[345c66ba-5d3a-442c-a0e6-40054b953592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.925 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:35:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:35:45.926 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'env', 'PROCESS_TAG=haproxy-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.012 182729 DEBUG nova.compute.manager [req-a6d9baae-a374-461e-83cd-20dcf89bed57 req-226c8285-e47c-4556-972e-b5b5c8a152ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.012 182729 DEBUG oslo_concurrency.lockutils [req-a6d9baae-a374-461e-83cd-20dcf89bed57 req-226c8285-e47c-4556-972e-b5b5c8a152ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.012 182729 DEBUG oslo_concurrency.lockutils [req-a6d9baae-a374-461e-83cd-20dcf89bed57 req-226c8285-e47c-4556-972e-b5b5c8a152ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.013 182729 DEBUG oslo_concurrency.lockutils [req-a6d9baae-a374-461e-83cd-20dcf89bed57 req-226c8285-e47c-4556-972e-b5b5c8a152ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.013 182729 DEBUG nova.compute.manager [req-a6d9baae-a374-461e-83cd-20dcf89bed57 req-226c8285-e47c-4556-972e-b5b5c8a152ff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Processing event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.063 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121346.0632522, 49096ce9-494c-4c84-b263-86a05230d8af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.064 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] VM Started (Lifecycle Event)
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.066 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.070 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.078 182729 INFO nova.virt.libvirt.driver [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Instance spawned successfully.
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.078 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.088 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.092 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.101 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.101 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.102 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.102 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.102 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.102 182729 DEBUG nova.virt.libvirt.driver [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.128 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.128 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121346.063541, 49096ce9-494c-4c84-b263-86a05230d8af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.129 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] VM Paused (Lifecycle Event)
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.154 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.158 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121346.0696766, 49096ce9-494c-4c84-b263-86a05230d8af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.158 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] VM Resumed (Lifecycle Event)
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.185 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.199 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.245 182729 INFO nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Took 7.80 seconds to spawn the instance on the hypervisor.
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.245 182729 DEBUG nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.249 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.372 182729 INFO nova.compute.manager [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Took 8.87 seconds to build instance.
Jan 22 22:35:46 compute-0 podman[227239]: 2026-01-22 22:35:46.383474839 +0000 UTC m=+0.082241332 container create 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:35:46 compute-0 nova_compute[182725]: 2026-01-22 22:35:46.394 182729 DEBUG oslo_concurrency.lockutils [None req-2bf5e9c0-fb9f-404d-affc-0b734747dbe7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:46 compute-0 podman[227239]: 2026-01-22 22:35:46.328638848 +0000 UTC m=+0.027405391 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:35:46 compute-0 systemd[1]: Started libpod-conmon-7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e.scope.
Jan 22 22:35:46 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:35:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78185cb462a48e133227c168e3ae057c635ad4861d3609b2f0033639219b978c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:35:46 compute-0 podman[227239]: 2026-01-22 22:35:46.480254466 +0000 UTC m=+0.179020939 container init 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:35:46 compute-0 podman[227239]: 2026-01-22 22:35:46.48754414 +0000 UTC m=+0.186310593 container start 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:35:46 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [NOTICE]   (227258) : New worker (227260) forked
Jan 22 22:35:46 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [NOTICE]   (227258) : Loading success.
Jan 22 22:35:47 compute-0 nova_compute[182725]: 2026-01-22 22:35:47.006 182729 DEBUG nova.network.neutron [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updated VIF entry in instance network info cache for port 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:35:47 compute-0 nova_compute[182725]: 2026-01-22 22:35:47.007 182729 DEBUG nova.network.neutron [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updating instance_info_cache with network_info: [{"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:47 compute-0 nova_compute[182725]: 2026-01-22 22:35:47.041 182729 DEBUG oslo_concurrency.lockutils [req-1bfa4c92-6545-4ee5-b45c-952ec951accf req-1e70f643-a56a-4f88-9cba-b588aee28b00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.151 182729 DEBUG nova.compute.manager [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.152 182729 DEBUG oslo_concurrency.lockutils [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.152 182729 DEBUG oslo_concurrency.lockutils [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.153 182729 DEBUG oslo_concurrency.lockutils [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.153 182729 DEBUG nova.compute.manager [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] No waiting events found dispatching network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:35:48 compute-0 nova_compute[182725]: 2026-01-22 22:35:48.153 182729 WARNING nova.compute.manager [req-c4081efb-73e2-4f12-aa05-8124b7db4382 req-a978ae25-9364-442f-b9cd-78f56c674164 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received unexpected event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 for instance with vm_state active and task_state None.
Jan 22 22:35:49 compute-0 nova_compute[182725]: 2026-01-22 22:35:49.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:49 compute-0 nova_compute[182725]: 2026-01-22 22:35:49.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:50 compute-0 podman[227269]: 2026-01-22 22:35:50.151955011 +0000 UTC m=+0.087856243 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 22:35:52 compute-0 nova_compute[182725]: 2026-01-22 22:35:52.184 182729 DEBUG nova.compute.manager [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-changed-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:35:52 compute-0 nova_compute[182725]: 2026-01-22 22:35:52.184 182729 DEBUG nova.compute.manager [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Refreshing instance network info cache due to event network-changed-eecbb79f-fdf2-48a6-828b-d9fc528ec81f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:35:52 compute-0 nova_compute[182725]: 2026-01-22 22:35:52.185 182729 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:35:52 compute-0 nova_compute[182725]: 2026-01-22 22:35:52.185 182729 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:35:52 compute-0 nova_compute[182725]: 2026-01-22 22:35:52.185 182729 DEBUG nova.network.neutron [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Refreshing network info cache for port eecbb79f-fdf2-48a6-828b-d9fc528ec81f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:35:54 compute-0 nova_compute[182725]: 2026-01-22 22:35:54.070 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:54 compute-0 nova_compute[182725]: 2026-01-22 22:35:54.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:55 compute-0 nova_compute[182725]: 2026-01-22 22:35:55.046 182729 DEBUG nova.network.neutron [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updated VIF entry in instance network info cache for port eecbb79f-fdf2-48a6-828b-d9fc528ec81f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:35:55 compute-0 nova_compute[182725]: 2026-01-22 22:35:55.047 182729 DEBUG nova.network.neutron [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:35:55 compute-0 nova_compute[182725]: 2026-01-22 22:35:55.067 182729 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:35:56 compute-0 podman[227291]: 2026-01-22 22:35:56.182332029 +0000 UTC m=+0.097713291 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41)
Jan 22 22:35:56 compute-0 podman[227290]: 2026-01-22 22:35:56.182848032 +0000 UTC m=+0.112415272 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:35:58 compute-0 ovn_controller[94850]: 2026-01-22T22:35:58Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:1f:e7 10.100.0.4
Jan 22 22:35:58 compute-0 ovn_controller[94850]: 2026-01-22T22:35:58Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:1f:e7 10.100.0.4
Jan 22 22:35:59 compute-0 nova_compute[182725]: 2026-01-22 22:35:59.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:59 compute-0 nova_compute[182725]: 2026-01-22 22:35:59.577 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:35:59 compute-0 ovn_controller[94850]: 2026-01-22T22:35:59Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:aa:1b:c8 10.100.0.14
Jan 22 22:35:59 compute-0 ovn_controller[94850]: 2026-01-22T22:35:59Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:1b:c8 10.100.0.14
Jan 22 22:36:04 compute-0 nova_compute[182725]: 2026-01-22 22:36:04.074 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:04 compute-0 nova_compute[182725]: 2026-01-22 22:36:04.580 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:05 compute-0 nova_compute[182725]: 2026-01-22 22:36:05.174 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:05.173 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:05.178 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:36:06 compute-0 podman[227373]: 2026-01-22 22:36:06.168520714 +0000 UTC m=+0.084213642 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 22:36:06 compute-0 podman[227374]: 2026-01-22 22:36:06.176994395 +0000 UTC m=+0.090399926 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:36:06 compute-0 podman[227417]: 2026-01-22 22:36:06.310680002 +0000 UTC m=+0.089815263 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.843 182729 DEBUG oslo_concurrency.lockutils [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.844 182729 DEBUG oslo_concurrency.lockutils [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.844 182729 DEBUG nova.compute.manager [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.849 182729 DEBUG nova.compute.manager [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.850 182729 DEBUG nova.objects.instance [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.874 182729 DEBUG nova.objects.instance [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'info_cache' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:06 compute-0 nova_compute[182725]: 2026-01-22 22:36:06.910 182729 DEBUG nova.virt.libvirt.driver [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:36:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:08.181 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.079 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.111 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '49096ce9-494c-4c84-b263-86a05230d8af', 'name': 'tempest-₡-1722590006', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000070', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '25a5678696f747b3ac42324626646e40', 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'hostId': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.113 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'name': 'tempest-ServerActionsTestJSON-server-826657528', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '301c97a097c64afd8d55adb73fdd8cce', 'user_id': '97ae504d8c4f43529c360266766791d0', 'hostId': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.115 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9b1f07a8546648baba916fffc53a0b93', 'user_id': '9d1e26d3056148e692e157703469d77a', 'hostId': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.118 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 49096ce9-494c-4c84-b263-86a05230d8af / tap9b7c9dcb-22 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.118 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.121 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b924048a-36af-45c3-80fd-9400d5975e6a / tapeecbb79f-fd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:36:09 compute-0 kernel: tapeecbb79f-fd (unregistering): left promiscuous mode
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.121 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.125 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9dc942b5-8b65-4eb7-a57c-30d0a6221426 / tap03c52ba6-92 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.125 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 NetworkManager[54954]: <info>  [1769121369.1269] device (tapeecbb79f-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '302a051d-fea3-4a19-a8fc-0208acced5b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.115318', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf36beec-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '67834f5d0ff03d3749909e2ee67aaf918a218df67487c9b82ed122caf6311d64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.115318', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf373084-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': 'd519b64d68e1f204564358b3bdb910b3ad9a44eca08c70c04c30a1575fdc9bbe'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.115318', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf37d19c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': 'f260812334baf36ac9c798f9688e7c1d44c434ab2c8d35d1862cabd7a84e5840'}]}, 'timestamp': '2026-01-22 22:36:09.126294', '_unique_id': '687a2a01c4c2426fbd00c97f9f35f7db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.127 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.129 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.129 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.129 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.incoming.bytes volume: 987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc00c72e-e858-4baf-9e75-924f0d01ee66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.129091', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf384c4e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '24c41c8fc9de81221e6f1d7b150a8aa7f73f5b3ade0437935cd3c66c014ecc1a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '97ae504d8c4f43529c360266766791d0', 
'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.129091', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf3857f2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '4a7af80c1110143b68521d924d480c7228c0f6ccc9770e99fa3f6812bbc8a25f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 987, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.129091', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf386440-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': 'fbeae61df828d6dab9e2bf4f4abcca2255a2e81b51640999ef3981ac0ee7ec87'}]}, 'timestamp': '2026-01-22 22:36:09.130043', '_unique_id': 'd2394884db50440d86cb6996d3422a9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.130 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.131 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.131 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>]
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.132 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.141 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 ovn_controller[94850]: 2026-01-22T22:36:09Z|00425|binding|INFO|Releasing lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f from this chassis (sb_readonly=0)
Jan 22 22:36:09 compute-0 ovn_controller[94850]: 2026-01-22T22:36:09Z|00426|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f down in Southbound
Jan 22 22:36:09 compute-0 ovn_controller[94850]: 2026-01-22T22:36:09Z|00427|binding|INFO|Removing iface tapeecbb79f-fd ovn-installed in OVS
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.146 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.158 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.159 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/memory.usage volume: 42.765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.159 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.163 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.165 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[acc391b0-ca93-44f6-9ae0-b07769a05391]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.165 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:36:09 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 22 22:36:09 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006f.scope: Consumed 12.997s CPU time.
Jan 22 22:36:09 compute-0 systemd-machined[154006]: Machine qemu-49-instance-0000006f terminated.
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [NOTICE]   (227151) : haproxy version is 2.8.14-c23fe91
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [NOTICE]   (227151) : path to executable is /usr/sbin/haproxy
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [WARNING]  (227151) : Exiting Master process...
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [WARNING]  (227151) : Exiting Master process...
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [ALERT]    (227151) : Current worker (227153) exited with code 143 (Terminated)
Jan 22 22:36:09 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227147]: [WARNING]  (227151) : All workers exited. Exiting... (0)
Jan 22 22:36:09 compute-0 systemd[1]: libpod-bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004.scope: Deactivated successfully.
Jan 22 22:36:09 compute-0 podman[227466]: 2026-01-22 22:36:09.301742453 +0000 UTC m=+0.049967168 container died bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:36:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004-userdata-shm.mount: Deactivated successfully.
Jan 22 22:36:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-67e77da44cd34c45e600822f0c6abab867a47bc8facf833a4a5f4421ca048792-merged.mount: Deactivated successfully.
Jan 22 22:36:09 compute-0 podman[227466]: 2026-01-22 22:36:09.354990332 +0000 UTC m=+0.103215097 container cleanup bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:36:09 compute-0 systemd[1]: libpod-conmon-bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004.scope: Deactivated successfully.
Jan 22 22:36:09 compute-0 podman[227496]: 2026-01-22 22:36:09.424117708 +0000 UTC m=+0.043719212 container remove bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.425 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.431 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7591b3e1-98f6-4636-ad4d-1f2ae10ffa38]: (4, ('Thu Jan 22 10:36:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004)\nbf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004\nThu Jan 22 10:36:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (bf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004)\nbf24c830e388e9a5c0d5e60a946b36e942d6adc0dbb01f8104894752b385d004\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.433 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e03fcaf8-c9d7-4fa8-9177-61d0860b73c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.437 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.439 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.446 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/memory.usage volume: 42.26171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fa104de-ad13-4d10-8363-cf385e3e13fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.765625, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'timestamp': '2026-01-22T22:36:09.132230', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'bf3cea4c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.816374853, 'message_signature': '0e0c83a9811285f829503b1976aff92e4f246277770a4fff3e836935243e9edc'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.26171875, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'timestamp': '2026-01-22T22:36:09.132230', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'bf68d5a8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.103700805, 'message_signature': '93d7e5bd3ce3d1fc24b4ef3d7e043c4ffc3a5d36a55bd9ba61c4fec4ae16bc0a'}]}, 'timestamp': '2026-01-22 22:36:09.447600', '_unique_id': 'eba2b7cf6287493da963bae019f08821'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.456 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.460 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e058dd16-bcaf-4d73-a0f6-4da867dfd361]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.464 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.464 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.466 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.474 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a35659f0-7bc2-4abb-be8b-4c4314f4ac56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.474 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d26e73ce-c96f-408f-8781-84feb8b0a878]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.477 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.477 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c1cf805-3eed-42a3-8e4c-9133bef35c28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.450636', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf6b7010-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': 'ea89d6a18a4391c7cb4baf348cfecc457fcefdfc9d049fe52ed2a4cd7302fc45'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.450636', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf6b7fa6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': '42f3692ed758a70a2d1f2597c058966f24cc71a29a36e71951c54c9fdea66024'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.450636', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf6d699c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': '6386bf0806a4b7988f93cee0b3fa5c4666c7b7abc24473163a5eac973ff08f1e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.450636', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf6d746e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': 'd1a929df489b2608530392e8cd01d30867744b0b0938ad44fb1adf8d871037aa'}]}, 'timestamp': '2026-01-22 22:36:09.477759', '_unique_id': 'a28d9b71d2ca42329ca37fa0f9e329dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.478 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.479 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.479 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.480 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.480 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c81ab77-4bbc-4ddf-ba20-db9c823a5ebb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.479751', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf6dcd10-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': 'ccd42f6ff3ab4e726672bf40d59c394681119d59a1521c8c557a83f15465312b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.479751', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf6dd5da-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '790b958bf6a1625bdfac1b13956585f01bd76f68c215a8c8d88302da8a2bda67'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.479751', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf6dddf0-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': '336b4a2207b4d0b7bbc40baf9bbb08730f2612daabae9666eba435316e90ffe1'}]}, 'timestamp': '2026-01-22 22:36:09.480446', '_unique_id': 'f1c581d9052c4a479a626d996e84950b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.481 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.482 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>]
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.482 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.482 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.482 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.outgoing.bytes volume: 3320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.482 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.outgoing.bytes volume: 1278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d5552a6-6f71-47f5-85aa-e70f0e833fc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.482298', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf6e3106-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '7ede397416dfe4098f3fd1a318a3522eb95111b9cd7adb2319d7f6baee4d7777'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3320, 'user_id': '97ae504d8c4f43529c360266766791d0', 
'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.482298', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf6e3c82-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '24553fb4404438702ea3848b5e79b1eb5e905db3c591351e8f45489f88ccda85'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1278, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.482298', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf6e4754-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': '14162369297fd01080c8fe643d414bc3683ea56d131dd3fe1fecb67f694bfdbe'}]}, 'timestamp': '2026-01-22 22:36:09.483148', '_unique_id': 'e0ac5e2a525c4502a1d59bbd86bd5d98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.491 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[36ae9d07-84a1-444c-9863-3553ffed8306]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498162, 'reachable_time': 41570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227530, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.497 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:36:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:09.498 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc1e691-f327-4677-a176-dba566730f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.511 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.bytes volume: 72835072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.511 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.512 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.544 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.bytes volume: 307200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.545 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '241bb497-0bb3-4b68-8406-ea4eef01262a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72835072, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.484845', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf72a8e4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'aa2050f29ce3549fe99bdfd1322951c59250a4a5f6893b842286f62d70a6a167'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.484845', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf72b4c4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'd14fb00616e06697c0bc6ef9f706cc9a38542d880f5667f1706547366be86766'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 307200, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.484845', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf77bd5c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': 'abd9e1f89498922b55189dc001b03913728974f7c5113a667d58dd98cbc72e00'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.484845', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf77ccde-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': 'a130427edcab12920f9c3c421f9fde2afe1bd687e443980d1bbda376a5324160'}]}, 'timestamp': '2026-01-22 22:36:09.545560', '_unique_id': '153e983b3ecb4d82b2be180b9152c1c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.547 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.547 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.548 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.548 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.requests volume: 36 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c2f534f-2498-4ed5-a39e-9478505bcc96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.547491', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7821de-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '89ce4dc68551a5ee6f3c5c772b787f5ab5d06c01aace656e908f832741707603'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.547491', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf782a9e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'bf0d938beab035dabf55899007a44a189cc5adc2f63a3640cf5b8e337cc80fec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 36, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.547491', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf785942-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': 'bf5334349ddbc3fa45e38a85dc019e47c08e8e3701b9595863d84a74be695d19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.547491', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf786496-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '615dea4b518555c6de2a41f283f94f914c33348defcac09f7fb3dd69a126ef73'}]}, 'timestamp': '2026-01-22 22:36:09.549429', '_unique_id': '1a326e0e7c8d4a4ba6c307a956c8b336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.550 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.550 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93187797-d1ed-4828-a6b2-4117e7e960e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.550707', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf78a03c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '3b1d3613d0ecfe7cc3c1ab7d3d97cb9dcb2e148d9290c2abb8b927cefb7f0716'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 
'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.550707', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf78a8b6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '95ff1c94bee2eb4fc3e65917a18439198af2a4d142e8b7ef58ac9b26ef36d86f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.550707', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf78b09a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': 'de42a8838d63538533f6e9bbcdc75ee982f55cffb051daac12fa118e0c14e247'}]}, 'timestamp': '2026-01-22 22:36:09.551371', '_unique_id': '19c80bcfc1914a6bb41b764152175706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.551 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.552 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.552 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.553 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.554 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.554 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.554 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b87d631-fb81-4b40-afd2-79cbc36779fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.552825', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf78f30c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': '2326458d92d6318bd8792b8aabcdff7d54d307d060b4d8cef75a8874e9b178b4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.552825', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf78ff00-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': 'c73a7eefc85dbd6e156074585cc6c52abbd566484c5f850c813f51f79b976c11'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.552825', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf792f84-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': '718a85521616ed08db2002da0b981fa0dd8ffa0b583ede4bd244008206c8715d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.552825', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf793aa6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': '0cea5030e3a4516ce770bb8f6e19c59e28afc02295f2a62fbd9c99a85f03491a'}]}, 'timestamp': '2026-01-22 22:36:09.554918', '_unique_id': '5d9ec70e417841a5b73569cd73f82b19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.555 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.556 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.556 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.556 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '704eb59e-2290-46a1-845d-557647b70101', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.556154', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf797494-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': 'b604dd9940ed46cb75d647b9fbb8c512d80d17bf9b58a91d87025cdad13f8f3b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '97ae504d8c4f43529c360266766791d0', 
'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.556154', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf797d0e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '36abdedf35f19921ab50db5defe705d46a171e3c421228ff2314e71498f2d0a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.556154', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 
'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf7984fc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': 'd5e9989eb1b9abe6d491ba04889de2e82eb88c32a342722d8d1cebdefb2f4678'}]}, 'timestamp': '2026-01-22 22:36:09.556824', '_unique_id': '50ee0fc4e0b24330b59a1af1af180b3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.557 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.bytes volume: 31013376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.558 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.558 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.bytes volume: 32040960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c25cec5-3236-4dea-b891-b6873b1d4519', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31013376, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.557959', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf79ba44-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '5a6914a64263237e20924751d2598a5ae5b01f12527975c6d1fa38b93fbc70c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.557959', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf79c1ba-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'eec4dd40d1348bf34506e314dc32bc761165a476d84783e8b31258c110f34127'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32040960, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.557959', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf79e604-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '5909ead68f630eae4406a8293b7b7c38a843d4b8c9d3cab2808ce13e072535a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.557959', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf79edca-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': 'dd6a6f859b8ed52a9b8896b888b0fe4b458c3949dff6e0a1275ddb12838b6def'}]}, 'timestamp': '2026-01-22 22:36:09.559483', '_unique_id': '39d4f761c4ce41ac98120e277e00d53a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.560 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.560 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.561 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.561 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5c3f8bc-c0e2-426a-9be5-dbdbd4c6b16c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.560607', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7a21b4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': '38e9815cf4e0a122bb5ccb1a03c622ddd09d732d780dd1f53d8c159d8454eb70'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.560607', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7a2a2e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.10791322, 'message_signature': '05e833ee438eb43684edc0ef2f3ebecf4010ef2961dfbb8320267d29a29eedf6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.560607', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7a560c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': '461d1d5caa6848fa8c6231461086443aa5a3db27790c480c8c215e60f65d2d90'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.560607', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7a5db4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.124025942, 'message_signature': '67ed741a377654ed016d6a8e9869f5edefb4351de06716850f7d4e597531600b'}]}, 'timestamp': '2026-01-22 22:36:09.562350', '_unique_id': 'c1705fcc7e3c46668ce1004cb99bfa3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.563 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.563 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.latency volume: 2652151320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.563 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.564 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.564 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.latency volume: 41087153 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.564 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4f61865-be2b-4906-82ff-a5c57b61c6fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2652151320, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.563472', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7a91a8-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'f46da0abccbaf533610cd306faf28ba37545765574cbb3cf5dbfc9813f3e00cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.563472', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7a991e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '327aa91a6015c00f533b1bd6e500c44339de736cb0f7d9780448c03ce30267b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41087153, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.563472', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7ab976-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '0df9f1942e27c0e65e28ca0561ead209d63c0ce23047a4267e79c4380c095c0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.563472', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7ac18c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '59c28c6a97600cfbc154c7406f259b5c132901e305749b212fa1efc31b6eda34'}]}, 'timestamp': '2026-01-22 22:36:09.564905', '_unique_id': 'a49cd8725ce54998bbad27157203d1f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.565 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.566 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.566 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.latency volume: 219993296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.566 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.latency volume: 20319243 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.566 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.latency volume: 219338368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.latency volume: 19509953 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7412a159-ea75-415c-b37d-c9f4ef955906', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 219993296, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.566085', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7af7a6-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '8b3eda1a8feac59bb3c1323137adeb739509c6d9be6d21b03d77cf60fdf28544'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20319243, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.566085', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7aff62-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '2d25f1f392c24926390e0912d397bf218d31f5f9e67987266e94089785fcd180'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 219338368, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.566085', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7b1ccc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '5bf59b53b8e4557fa1435dd46d5b6555bcea97c0bab90703e3e52c808f5902fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19509953, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.566085', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7b2456-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '057fe50ff4992233489a4b7497abc5ebf50041d77c2b22c877467b7430bc62ee'}]}, 'timestamp': '2026-01-22 22:36:09.567435', '_unique_id': '4eac42d6a52741268262df2f3c764302'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.567 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.568 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.568 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/cpu volume: 11470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.569 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.569 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/cpu volume: 11260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '884295db-52d3-4884-a24f-b04d25fc7a6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11470000000, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'timestamp': '2026-01-22T22:36:09.568646', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bf7b5be2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.816374853, 'message_signature': '0828368b5db0e1c0885c849e06e2266ed6a18cb83ee018d351c600ff1416f91b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11260000000, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'timestamp': 
'2026-01-22T22:36:09.568646', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bf7b7f0a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.103700805, 'message_signature': '8f813451d7e65dfb1c8c65e3ccdcec763a1b53eb822708f4c911125c9d208db5'}]}, 'timestamp': '2026-01-22 22:36:09.569763', '_unique_id': '59cbb15241de4c78b5ca7f83d7afddda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.570 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.571 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.571 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37bf0db8-fa8b-47a5-bca8-83f321ae10bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.570878', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf7bb330-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': 'e3f17a6d75e535324cbd13491448da1ae7329c5e97ed52098111759d18ecdac6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.570878', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf7bbb32-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': 'cbfa986c5a894ff6299b699158d72e2a3afc79a35763b1462bcdb7b6f1141557'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.570878', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf7bc30c-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': 'e2c5c7b2d315735b889adc4b741530670d6fa1da1bd907bbe3c45b7162fb947c'}]}, 'timestamp': '2026-01-22 22:36:09.571505', '_unique_id': '7288ae5873df4f3caf002f9957f59024'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.requests volume: 1134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.572 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.573 12 DEBUG ceilometer.compute.pollsters [-] Instance b924048a-36af-45c3-80fd-9400d5975e6a was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000006f, id=b924048a-36af-45c3-80fd-9400d5975e6a>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.573 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.requests volume: 1210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d3f7b26-20aa-4c6a-8a1f-bd60e507d1d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1134, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': '49096ce9-494c-4c84-b263-86a05230d8af-vda', 'timestamp': '2026-01-22T22:36:09.572632', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7bf778-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': 'd18b4bcf695e0e4c1d82b70c468feecc0794fca44e1953b72ed145fe379467d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 
'49096ce9-494c-4c84-b263-86a05230d8af-sda', 'timestamp': '2026-01-22T22:36:09.572632', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'instance-00000070', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7bffde-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.142128914, 'message_signature': '5e92a82046714968869f2cf4d68e13da111e732895a0b666326673b3b76bc16b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1210, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-vda', 'timestamp': '2026-01-22T22:36:09.572632', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 
'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bf7c2aa4-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '4528520e1960446d897222da4da360d4aa16aa4fe23a561365537b7202163fb1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426-sda', 'timestamp': '2026-01-22T22:36:09.572632', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'instance-0000006c', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': '0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bf7c362a-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5007.170230906, 'message_signature': '16062188ba0c89fe485b0503a2b8012d1478d4290782361347950d00c31b8e72'}]}, 'timestamp': '2026-01-22 22:36:09.574450', '_unique_id': '4bbffd32fd074f71b7a234acf17b1857'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.574 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.575 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0db3045-324e-408e-b53f-458941b719d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.575694', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf7c7248-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '6d6d5850fa9e61fd027edbdc858663cae673be0ef5ec9e65e65ab64322c87b73'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.575694', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf7c7ac2-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': 'c300119e5cf31bd59745a0b3ec768fb85f909f5f7f504ad01c7f35d41155c9f0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.575694', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf7c82b0-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': '93cfc35b83f4a79dbb916cb2e76ee501dc87b59561fe1448083cc358a30e9917'}]}, 'timestamp': '2026-01-22 22:36:09.576407', '_unique_id': '2f3a4effa95b435493f37ec65171a2a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.576 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.577 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.577 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>]
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.578 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.578 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.578 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea5cc47e-9238-4224-8024-4c5652b934ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.578103', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf7ccd56-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': '92d6988feb1a282c70b9d715d9c5a8b5b98e622110ac39e95202c549e104f5ca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.578103', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf7cd5bc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': 'a50295668ffa237f23f397bc3b22aa4c850ce6fb6b92be3e1496f2e5da6d3049'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.578103', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf7cdecc-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': '6d122e3cd6f1c702810ee5fdebb21a954ae08a5f740c6fdacf3c75264517fa01'}]}, 'timestamp': '2026-01-22 22:36:09.578772', '_unique_id': '41d6ddbe9dcc482aa527baf8792a37e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.579 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.580 12 DEBUG ceilometer.compute.pollsters [-] 49096ce9-494c-4c84-b263-86a05230d8af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.580 12 DEBUG ceilometer.compute.pollsters [-] b924048a-36af-45c3-80fd-9400d5975e6a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.580 12 DEBUG ceilometer.compute.pollsters [-] 9dc942b5-8b65-4eb7-a57c-30d0a6221426/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b1d9083-d426-4e02-bd9b-930d5bc3a56b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_name': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_name': None, 'resource_id': 'instance-00000070-49096ce9-494c-4c84-b263-86a05230d8af-tap9b7c9dcb-22', 'timestamp': '2026-01-22T22:36:09.580098', 'resource_metadata': {'display_name': 'tempest-₡-1722590006', 'name': 'tap9b7c9dcb-22', 'instance_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'instance_type': 'm1.nano', 'host': '6ebe532796ec79092121b6b6d4c3b1b18421159623065b3cdd83ac93', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:aa:1b:c8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b7c9dcb-22'}, 'message_id': 'bf7d1b4e-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77257779, 'message_signature': 'f64d7a8cac024580384c8f60ef5210b8bf81a0c58314b9ee465f72c127ade0b0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'97ae504d8c4f43529c360266766791d0', 'user_name': None, 'project_id': '301c97a097c64afd8d55adb73fdd8cce', 'project_name': None, 'resource_id': 'instance-0000006f-b924048a-36af-45c3-80fd-9400d5975e6a-tapeecbb79f-fd', 'timestamp': '2026-01-22T22:36:09.580098', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-826657528', 'name': 'tapeecbb79f-fd', 'instance_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'instance_type': 'm1.nano', 'host': 'a7ab2bb2d15fe5a1fad8b3af24f5bb49e169a568625d6e483c82a878', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'stopped', 'state': 'shutdown', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:1f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeecbb79f-fd'}, 'message_id': 'bf7d2350-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.776636951, 'message_signature': '4388b2687cbd3cfbb4dfb96a3e0a529185f2e8634b2d4688067018f7f48359f2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006c-9dc942b5-8b65-4eb7-a57c-30d0a6221426-tap03c52ba6-92', 'timestamp': '2026-01-22T22:36:09.580098', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-721180198', 'name': 'tap03c52ba6-92', 'instance_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'instance_type': 'm1.nano', 'host': 
'0ae89c8d0776739b38e078e3b2b8ed3b70d2fdbe4d839a25950624b2', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:7f:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03c52ba6-92'}, 'message_id': 'bf7d2aee-f7e2-11f0-9a35-fa163e3d8874', 'monotonic_time': 5006.77941335, 'message_signature': '00cad78578cc7ab49178e9d170234390752011b9850677567601560bb49c3f74'}]}, 'timestamp': '2026-01-22 22:36:09.580757', '_unique_id': 'b2f2320b8e374966a5bf9db8371284b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.581 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.582 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:36:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:36:09.582 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1722590006>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-826657528>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-721180198>]
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.582 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.776 182729 DEBUG nova.compute.manager [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.776 182729 DEBUG oslo_concurrency.lockutils [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.777 182729 DEBUG oslo_concurrency.lockutils [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.777 182729 DEBUG oslo_concurrency.lockutils [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.777 182729 DEBUG nova.compute.manager [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.777 182729 WARNING nova.compute.manager [req-7e04d1e1-4b09-40fa-bc55-75f94e0f810a req-42a2200b-4228-4153-8f24-9dff39370d67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state active and task_state powering-off.
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.934 182729 INFO nova.virt.libvirt.driver [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance shutdown successfully after 3 seconds.
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.940 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance destroyed successfully.
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.941 182729 DEBUG nova.objects.instance [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'numa_topology' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:09 compute-0 nova_compute[182725]: 2026-01-22 22:36:09.963 182729 DEBUG nova.compute.manager [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:10 compute-0 nova_compute[182725]: 2026-01-22 22:36:10.082 182729 DEBUG oslo_concurrency.lockutils [None req-f5d6f627-a81c-48c7-9348-07462e11e50c 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.886 182729 DEBUG nova.compute.manager [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.887 182729 DEBUG oslo_concurrency.lockutils [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.887 182729 DEBUG oslo_concurrency.lockutils [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.888 182729 DEBUG oslo_concurrency.lockutils [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.888 182729 DEBUG nova.compute.manager [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:11 compute-0 nova_compute[182725]: 2026-01-22 22:36:11.889 182729 WARNING nova.compute.manager [req-39f49f6b-0778-415d-97ef-f855552de2a5 req-ce469341-b75f-40ca-9d0c-4ae2f97489f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state stopped and task_state None.
Jan 22 22:36:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:12.443 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:12.444 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:12.444 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:14 compute-0 nova_compute[182725]: 2026-01-22 22:36:14.081 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:14 compute-0 nova_compute[182725]: 2026-01-22 22:36:14.583 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:15 compute-0 nova_compute[182725]: 2026-01-22 22:36:15.585 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:15 compute-0 nova_compute[182725]: 2026-01-22 22:36:15.610 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'info_cache' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:15 compute-0 nova_compute[182725]: 2026-01-22 22:36:15.637 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:36:15 compute-0 nova_compute[182725]: 2026-01-22 22:36:15.637 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:36:15 compute-0 nova_compute[182725]: 2026-01-22 22:36:15.637 182729 DEBUG nova.network.neutron [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.370 182729 DEBUG nova.network.neutron [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.400 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.433 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance destroyed successfully.
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.434 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'numa_topology' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.448 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.471 182729 DEBUG nova.virt.libvirt.vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.472 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.472 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.473 182729 DEBUG os_vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.475 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeecbb79f-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.478 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.480 182729 INFO os_vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd')
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.486 182729 DEBUG nova.virt.libvirt.driver [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start _get_guest_xml network_info=[{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.489 182729 WARNING nova.virt.libvirt.driver [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.494 182729 DEBUG nova.virt.libvirt.host [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.494 182729 DEBUG nova.virt.libvirt.host [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.497 182729 DEBUG nova.virt.libvirt.host [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.498 182729 DEBUG nova.virt.libvirt.host [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.499 182729 DEBUG nova.virt.libvirt.driver [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.499 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.499 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.499 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.500 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.500 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.500 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.500 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.500 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.501 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.501 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.501 182729 DEBUG nova.virt.hardware [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.501 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'vcpu_model' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.519 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.578 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.579 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.580 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.580 182729 DEBUG oslo_concurrency.lockutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.581 182729 DEBUG nova.virt.libvirt.vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.582 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.583 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.584 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.601 182729 DEBUG nova.virt.libvirt.driver [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <uuid>b924048a-36af-45c3-80fd-9400d5975e6a</uuid>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <name>instance-0000006f</name>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestJSON-server-826657528</nova:name>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:36:17</nova:creationTime>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:user uuid="97ae504d8c4f43529c360266766791d0">tempest-ServerActionsTestJSON-587323330-project-member</nova:user>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:project uuid="301c97a097c64afd8d55adb73fdd8cce">tempest-ServerActionsTestJSON-587323330</nova:project>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         <nova:port uuid="eecbb79f-fdf2-48a6-828b-d9fc528ec81f">
Jan 22 22:36:17 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <system>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="serial">b924048a-36af-45c3-80fd-9400d5975e6a</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="uuid">b924048a-36af-45c3-80fd-9400d5975e6a</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </system>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <os>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </os>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <features>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </features>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk.config"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:1f:1f:e7"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <target dev="tapeecbb79f-fd"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/console.log" append="off"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <video>
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </video>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:36:17 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:36:17 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:36:17 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:36:17 compute-0 nova_compute[182725]: </domain>
Jan 22 22:36:17 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.603 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.662 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.663 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.720 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.722 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'trusted_certs' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.735 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.790 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.791 182729 DEBUG nova.virt.disk.api [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Checking if we can resize image /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.792 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.884 182729 DEBUG oslo_concurrency.processutils [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.885 182729 DEBUG nova.virt.disk.api [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Cannot resize image /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.886 182729 DEBUG nova.objects.instance [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'migration_context' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.900 182729 DEBUG nova.virt.libvirt.vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:36:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.900 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.901 182729 DEBUG nova.network.os_vif_util [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.901 182729 DEBUG os_vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.902 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.902 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.903 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.905 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.905 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeecbb79f-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.906 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeecbb79f-fd, col_values=(('external_ids', {'iface-id': 'eecbb79f-fdf2-48a6-828b-d9fc528ec81f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:1f:e7', 'vm-uuid': 'b924048a-36af-45c3-80fd-9400d5975e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.907 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:17 compute-0 NetworkManager[54954]: <info>  [1769121377.9087] manager: (tapeecbb79f-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.909 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.913 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:17 compute-0 nova_compute[182725]: 2026-01-22 22:36:17.914 182729 INFO os_vif [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd')
Jan 22 22:36:18 compute-0 kernel: tapeecbb79f-fd: entered promiscuous mode
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.0009] manager: (tapeecbb79f-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.020 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00428|binding|INFO|Claiming lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f for this chassis.
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00429|binding|INFO|eecbb79f-fdf2-48a6-828b-d9fc528ec81f: Claiming fa:16:3e:1f:1f:e7 10.100.0.4
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.033 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.034 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.036 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.036 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00430|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f ovn-installed in OVS
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00431|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f up in Southbound
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.049 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[11e64b6c-52a3-45e3-b59c-322db7118f5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.050 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:36:18 compute-0 systemd-machined[154006]: New machine qemu-51-instance-0000006f.
Jan 22 22:36:18 compute-0 systemd-udevd[227564]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.053 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.054 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[249a00de-5081-44e8-8a02-10a516acdd46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.054 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[651ce625-9f90-4110-8008-d41ace5ed994]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.0669] device (tapeecbb79f-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.0679] device (tapeecbb79f-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:36:18 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-0000006f.
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.068 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[11f17238-cbaf-4b1e-904a-e09f3aa98c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.095 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2d66386e-94c3-4099-82d7-bc288c43d2be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.125 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[12e4c5fd-d010-4cdf-b14a-007ec7d75475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.131 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[44b6aef3-9830-46b0-8659-f8c67c5c15f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.1329] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/208)
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.169 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fe94d7aa-d7ab-448a-9812-4ba5059c4275]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.173 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b46b54db-4b3c-4b5f-a3d7-accb37691682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.1977] device (tape65877e5-00): carrier: link connected
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.205 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc5bc33-3176-456a-b741-caa1ffcb3e27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.220 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3976912c-1c3d-48b2-a123-7fb40e2f9eea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501580, 'reachable_time': 41426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227596, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.236 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c3fed6-bf31-495d-b39e-ad78c06eae05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501580, 'tstamp': 501580}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227597, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.251 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1d043bd2-75f7-4542-898f-b94066a83fc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501580, 'reachable_time': 41426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227598, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.284 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[00bea874-51bf-4bf8-8e32-0e851549a8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.353 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fc8e09b4-de43-4a08-b5a5-044a3df1302e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.355 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.355 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.355 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:18 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.3581] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.357 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.360 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00432|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.362 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.362 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.363 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dc80bf34-2bc7-43c4-a4fe-7d9bc7f3d696]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.364 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.365 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.384 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.457 182729 DEBUG nova.compute.manager [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.457 182729 DEBUG oslo_concurrency.lockutils [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.458 182729 DEBUG oslo_concurrency.lockutils [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.458 182729 DEBUG oslo_concurrency.lockutils [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.458 182729 DEBUG nova.compute.manager [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.458 182729 WARNING nova.compute.manager [req-c0e36b65-9ce2-4e1f-8adb-a0830005e68c req-d3c962d5-3497-4f68-be07-f9e0a850b832 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state stopped and task_state powering-on.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.467 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.467 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.467 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.468 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.468 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.485 182729 INFO nova.compute.manager [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Terminating instance
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.502 182729 DEBUG nova.compute.manager [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.509 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for b924048a-36af-45c3-80fd-9400d5975e6a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.509 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121378.5066159, b924048a-36af-45c3-80fd-9400d5975e6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.510 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Resumed (Lifecycle Event)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.511 182729 DEBUG nova.compute.manager [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.515 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance rebooted successfully.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.515 182729 DEBUG nova.compute.manager [None req-e38c6b59-155a-48d6-aafa-4d6b4c2e863e 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:18 compute-0 kernel: tap03c52ba6-92 (unregistering): left promiscuous mode
Jan 22 22:36:18 compute-0 NetworkManager[54954]: <info>  [1769121378.5285] device (tap03c52ba6-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00433|binding|INFO|Releasing lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 from this chassis (sb_readonly=0)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00434|binding|INFO|Setting lport 03c52ba6-920a-4801-bb3e-c99e3203ab13 down in Southbound
Jan 22 22:36:18 compute-0 ovn_controller[94850]: 2026-01-22T22:36:18Z|00435|binding|INFO|Removing iface tap03c52ba6-92 ovn-installed in OVS
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.539 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.549 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.549 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:7f:79 10.100.0.5'], port_security=['fa:16:3e:6e:7f:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9dc942b5-8b65-4eb7-a57c-30d0a6221426', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '8', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=03c52ba6-920a-4801-bb3e-c99e3203ab13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.552 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.556 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.585 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121378.5085855, b924048a-36af-45c3-80fd-9400d5975e6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.585 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Started (Lifecycle Event)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.603 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:18 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 22 22:36:18 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006c.scope: Consumed 13.962s CPU time.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.610 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:36:18 compute-0 systemd-machined[154006]: Machine qemu-48-instance-0000006c terminated.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.778 182729 INFO nova.virt.libvirt.driver [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Instance destroyed successfully.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.778 182729 DEBUG nova.objects.instance [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'resources' on Instance uuid 9dc942b5-8b65-4eb7-a57c-30d0a6221426 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:18 compute-0 podman[227643]: 2026-01-22 22:36:18.793864107 +0000 UTC m=+0.056916041 container create b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.795 182729 DEBUG nova.virt.libvirt.vif [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-721180198',display_name='tempest-ServerStableDeviceRescueTest-server-721180198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-721180198',id=108,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-76kjeq8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:22Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=9dc942b5-8b65-4eb7-a57c-30d0a6221426,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.796 182729 DEBUG nova.network.os_vif_util [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "address": "fa:16:3e:6e:7f:79", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03c52ba6-92", "ovs_interfaceid": "03c52ba6-920a-4801-bb3e-c99e3203ab13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.796 182729 DEBUG nova.network.os_vif_util [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.797 182729 DEBUG os_vif [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.799 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.799 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03c52ba6-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.801 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.803 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.804 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.806 182729 INFO os_vif [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:7f:79,bridge_name='br-int',has_traffic_filtering=True,id=03c52ba6-920a-4801-bb3e-c99e3203ab13,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03c52ba6-92')
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.807 182729 INFO nova.virt.libvirt.driver [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Deleting instance files /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426_del
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.808 182729 INFO nova.virt.libvirt.driver [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Deletion of /var/lib/nova/instances/9dc942b5-8b65-4eb7-a57c-30d0a6221426_del complete
Jan 22 22:36:18 compute-0 systemd[1]: Started libpod-conmon-b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0.scope.
Jan 22 22:36:18 compute-0 podman[227643]: 2026-01-22 22:36:18.761828708 +0000 UTC m=+0.024880652 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:36:18 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:36:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958c21cda2fda227408ecee792eaeb8dad02d1df7712dfb9b0ebfeb69c2adea6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.871 182729 DEBUG nova.compute.manager [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.872 182729 DEBUG oslo_concurrency.lockutils [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.872 182729 DEBUG oslo_concurrency.lockutils [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.872 182729 DEBUG oslo_concurrency.lockutils [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.872 182729 DEBUG nova.compute.manager [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.872 182729 DEBUG nova.compute.manager [req-b441c678-9eee-4f77-bd39-43df8735357b req-7b2d419e-0f89-4934-9c26-25413e52cb1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-unplugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:36:18 compute-0 podman[227643]: 2026-01-22 22:36:18.879752861 +0000 UTC m=+0.142804805 container init b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.880 182729 INFO nova.compute.manager [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.881 182729 DEBUG oslo.service.loopingcall [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.881 182729 DEBUG nova.compute.manager [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:36:18 compute-0 nova_compute[182725]: 2026-01-22 22:36:18.881 182729 DEBUG nova.network.neutron [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:36:18 compute-0 podman[227643]: 2026-01-22 22:36:18.885231298 +0000 UTC m=+0.148283242 container start b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:36:18 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [NOTICE]   (227677) : New worker (227679) forked
Jan 22 22:36:18 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [NOTICE]   (227677) : Loading success.
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.935 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 03c52ba6-920a-4801-bb3e-c99e3203ab13 in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.937 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.938 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5fd22b-4618-4ec6-8b46-2817d253396c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:18.939 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore
Jan 22 22:36:19 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [NOTICE]   (226783) : haproxy version is 2.8.14-c23fe91
Jan 22 22:36:19 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [NOTICE]   (226783) : path to executable is /usr/sbin/haproxy
Jan 22 22:36:19 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [WARNING]  (226783) : Exiting Master process...
Jan 22 22:36:19 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [ALERT]    (226783) : Current worker (226785) exited with code 143 (Terminated)
Jan 22 22:36:19 compute-0 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226779]: [WARNING]  (226783) : All workers exited. Exiting... (0)
Jan 22 22:36:19 compute-0 systemd[1]: libpod-f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba.scope: Deactivated successfully.
Jan 22 22:36:19 compute-0 podman[227705]: 2026-01-22 22:36:19.074103823 +0000 UTC m=+0.052612615 container died f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba-userdata-shm.mount: Deactivated successfully.
Jan 22 22:36:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e8b88bef2adf302a173b6957c33fbebb042392c4213094c48f230deb845ffbf-merged.mount: Deactivated successfully.
Jan 22 22:36:19 compute-0 podman[227705]: 2026-01-22 22:36:19.114771178 +0000 UTC m=+0.093279940 container cleanup f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:36:19 compute-0 systemd[1]: libpod-conmon-f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba.scope: Deactivated successfully.
Jan 22 22:36:19 compute-0 podman[227738]: 2026-01-22 22:36:19.196453077 +0000 UTC m=+0.059715712 container remove f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.202 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[216d00f6-6670-4f91-ba5c-a1fb7810742d]: (4, ('Thu Jan 22 10:36:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba)\nf3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba\nThu Jan 22 10:36:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (f3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba)\nf3b855987e659d78d599996f92e33d569c20ebddf69ab9f61b66446f45ea75ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.205 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[629e4242-5187-44ac-a2d0-73e9d4b93b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.206 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.208 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:19 compute-0 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.221 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.224 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.227 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[afc2812c-db69-42d2-a62b-326a08f97726]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.250 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[da4e88c6-2237-4287-9150-34461be61a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.253 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[40ceaeeb-7ef1-457a-ae83-03ecc62482bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.280 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d51901c6-0b93-4544-ba2c-631288c5a63e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496017, 'reachable_time': 17224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227753, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.283 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:36:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:19.283 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[44244136-578e-4db8-9dcb-d236ae1c6b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:19 compute-0 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.562 182729 DEBUG nova.network.neutron [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.583 182729 INFO nova.compute.manager [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Took 0.70 seconds to deallocate network for instance.
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.684 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.686 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.726 182729 DEBUG nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.743 182729 DEBUG nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.743 182729 DEBUG nova.compute.provider_tree [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.755 182729 DEBUG nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.788 182729 DEBUG nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.877 182729 DEBUG nova.compute.provider_tree [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.895 182729 DEBUG nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.922 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:19 compute-0 nova_compute[182725]: 2026-01-22 22:36:19.954 182729 INFO nova.scheduler.client.report [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Deleted allocations for instance 9dc942b5-8b65-4eb7-a57c-30d0a6221426
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.062 182729 DEBUG oslo_concurrency.lockutils [None req-7c2ca365-11e1-45c0-99d3-0620e41a1857 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.850 182729 DEBUG nova.compute.manager [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.850 182729 DEBUG oslo_concurrency.lockutils [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.850 182729 DEBUG oslo_concurrency.lockutils [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.851 182729 DEBUG oslo_concurrency.lockutils [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.851 182729 DEBUG nova.compute.manager [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:20 compute-0 nova_compute[182725]: 2026-01-22 22:36:20.851 182729 WARNING nova.compute.manager [req-cbc5464d-cf76-4d17-b479-9793bc8c9fdc req-53422efe-f25c-4c3b-b188-2afb6e8fc687 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state active and task_state None.
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.022 182729 DEBUG nova.compute.manager [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.022 182729 DEBUG oslo_concurrency.lockutils [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.023 182729 DEBUG oslo_concurrency.lockutils [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.023 182729 DEBUG oslo_concurrency.lockutils [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9dc942b5-8b65-4eb7-a57c-30d0a6221426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.024 182729 DEBUG nova.compute.manager [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] No waiting events found dispatching network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.024 182729 WARNING nova.compute.manager [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received unexpected event network-vif-plugged-03c52ba6-920a-4801-bb3e-c99e3203ab13 for instance with vm_state deleted and task_state None.
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.024 182729 DEBUG nova.compute.manager [req-232cfdfd-4164-4698-a2ec-4b732ab5686d req-ba8f64e7-75d4-48cd-b65c-58f0e8f63e65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Received event network-vif-deleted-03c52ba6-920a-4801-bb3e-c99e3203ab13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:21 compute-0 podman[227754]: 2026-01-22 22:36:21.185773912 +0000 UTC m=+0.111516114 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:36:21 compute-0 nova_compute[182725]: 2026-01-22 22:36:21.919 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.128 182729 DEBUG nova.objects.instance [None req-bcf642d9-53fd-4556-9034-0b1086a4112b 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.148 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121382.1481745, b924048a-36af-45c3-80fd-9400d5975e6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.148 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Paused (Lifecycle Event)
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.180 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.185 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.203 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 22:36:22 compute-0 kernel: tapeecbb79f-fd (unregistering): left promiscuous mode
Jan 22 22:36:22 compute-0 NetworkManager[54954]: <info>  [1769121382.8513] device (tapeecbb79f-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:36:22 compute-0 ovn_controller[94850]: 2026-01-22T22:36:22Z|00436|binding|INFO|Releasing lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f from this chassis (sb_readonly=0)
Jan 22 22:36:22 compute-0 ovn_controller[94850]: 2026-01-22T22:36:22Z|00437|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f down in Southbound
Jan 22 22:36:22 compute-0 ovn_controller[94850]: 2026-01-22T22:36:22Z|00438|binding|INFO|Removing iface tapeecbb79f-fd ovn-installed in OVS
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.859 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.861 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:22.870 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:22.871 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:36:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:22.872 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.873 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:22.874 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c00ee5d1-2144-4968-882c-1e228dd55fce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:22 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:22.874 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:22 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 22 22:36:22 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006f.scope: Consumed 4.280s CPU time.
Jan 22 22:36:22 compute-0 systemd-machined[154006]: Machine qemu-51-instance-0000006f terminated.
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:22 compute-0 nova_compute[182725]: 2026-01-22 22:36:22.917 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:36:22 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [NOTICE]   (227677) : haproxy version is 2.8.14-c23fe91
Jan 22 22:36:22 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [NOTICE]   (227677) : path to executable is /usr/sbin/haproxy
Jan 22 22:36:22 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [WARNING]  (227677) : Exiting Master process...
Jan 22 22:36:22 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [ALERT]    (227677) : Current worker (227679) exited with code 143 (Terminated)
Jan 22 22:36:22 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[227673]: [WARNING]  (227677) : All workers exited. Exiting... (0)
Jan 22 22:36:22 compute-0 systemd[1]: libpod-b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0.scope: Deactivated successfully.
Jan 22 22:36:23 compute-0 podman[227800]: 2026-01-22 22:36:23.001010193 +0000 UTC m=+0.045610089 container died b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:36:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-958c21cda2fda227408ecee792eaeb8dad02d1df7712dfb9b0ebfeb69c2adea6-merged.mount: Deactivated successfully.
Jan 22 22:36:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0-userdata-shm.mount: Deactivated successfully.
Jan 22 22:36:23 compute-0 podman[227800]: 2026-01-22 22:36:23.032643693 +0000 UTC m=+0.077243579 container cleanup b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:36:23 compute-0 systemd[1]: libpod-conmon-b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0.scope: Deactivated successfully.
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.093 182729 DEBUG nova.compute.manager [None req-bcf642d9-53fd-4556-9034-0b1086a4112b 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:23 compute-0 podman[227830]: 2026-01-22 22:36:23.099631015 +0000 UTC m=+0.045437055 container remove b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.107 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d25e14b-3639-44e5-b07a-acc5c5b55f37]: (4, ('Thu Jan 22 10:36:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0)\nb7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0\nThu Jan 22 10:36:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (b7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0)\nb7bfe7ce20955fda8b06626e541ef7ea41cf0431edd402184e91d6dbe7aa99a0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.109 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf75677-fefc-435a-8a45-5f43249d3755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.111 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.113 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:23 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.132 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.137 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[070194ad-a7a6-4f3d-86f5-159ddb4989f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.152 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[35378985-113d-4026-a24c-c536de408979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.154 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[07197be0-1ce5-4bb5-b9b8-ca6ed3e7b1d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.168 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3b728e62-c00c-485c-a81b-bb68f9e54662]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501572, 'reachable_time': 16724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227866, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.173 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:36:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:23.173 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4b424e47-a97e-4d70-ae7b-ec2d5868a117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:23 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.217 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.276 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.277 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.328 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.333 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.391 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.392 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.452 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.608 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.609 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5378MB free_disk=73.21018600463867GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.609 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.609 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.672 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance b924048a-36af-45c3-80fd-9400d5975e6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.673 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 49096ce9-494c-4c84-b263-86a05230d8af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.673 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.674 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.728 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.749 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.780 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.781 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:23 compute-0 nova_compute[182725]: 2026-01-22 22:36:23.802 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:24 compute-0 nova_compute[182725]: 2026-01-22 22:36:24.086 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.095 182729 INFO nova.compute.manager [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Resuming
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.098 182729 DEBUG nova.objects.instance [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'flavor' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.147 182729 DEBUG oslo_concurrency.lockutils [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.147 182729 DEBUG oslo_concurrency.lockutils [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquired lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.147 182729 DEBUG nova.network.neutron [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.409 182729 DEBUG nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.409 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.409 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.410 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.410 182729 DEBUG nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.410 182729 WARNING nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state suspended and task_state resuming.
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.410 182729 DEBUG nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.411 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.411 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.411 182729 DEBUG oslo_concurrency.lockutils [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.411 182729 DEBUG nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.412 182729 WARNING nova.compute.manager [req-41e5d9ff-9f7c-4eeb-adf0-c781bfc5cbd8 req-00074105-4b4f-4fec-8437-e4f01b3373aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state suspended and task_state resuming.
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.781 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.782 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:25 compute-0 nova_compute[182725]: 2026-01-22 22:36:25.782 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.586 182729 DEBUG nova.network.neutron [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [{"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.603 182729 DEBUG oslo_concurrency.lockutils [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Releasing lock "refresh_cache-b924048a-36af-45c3-80fd-9400d5975e6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.608 182729 DEBUG nova.virt.libvirt.vif [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.609 182729 DEBUG nova.network.os_vif_util [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.609 182729 DEBUG nova.network.os_vif_util [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.610 182729 DEBUG os_vif [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.610 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.611 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.611 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.613 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.614 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeecbb79f-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.614 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeecbb79f-fd, col_values=(('external_ids', {'iface-id': 'eecbb79f-fdf2-48a6-828b-d9fc528ec81f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:1f:e7', 'vm-uuid': 'b924048a-36af-45c3-80fd-9400d5975e6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.614 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.615 182729 INFO os_vif [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd')
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.640 182729 DEBUG nova.objects.instance [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'numa_topology' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:26 compute-0 kernel: tapeecbb79f-fd: entered promiscuous mode
Jan 22 22:36:26 compute-0 NetworkManager[54954]: <info>  [1769121386.7266] manager: (tapeecbb79f-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/210)
Jan 22 22:36:26 compute-0 ovn_controller[94850]: 2026-01-22T22:36:26Z|00439|binding|INFO|Claiming lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f for this chassis.
Jan 22 22:36:26 compute-0 ovn_controller[94850]: 2026-01-22T22:36:26Z|00440|binding|INFO|eecbb79f-fdf2-48a6-828b-d9fc528ec81f: Claiming fa:16:3e:1f:1f:e7 10.100.0.4
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.730 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.738 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.739 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a bound to our chassis
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.740 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.745 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.748 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 ovn_controller[94850]: 2026-01-22T22:36:26Z|00441|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f ovn-installed in OVS
Jan 22 22:36:26 compute-0 ovn_controller[94850]: 2026-01-22T22:36:26Z|00442|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f up in Southbound
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.755 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.757 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82af3f48-5de0-4baa-8b94-b87407eedcbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.758 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65877e5-01 in ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.760 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65877e5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.760 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[228f82fb-645a-41f5-95b8-325394fc3c55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.761 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[25da4645-eb46-4944-8d87-6f8e0a70b0b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.772 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e9aa7630-66b1-4429-adef-9143f7e4ccd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 systemd-udevd[227923]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:36:26 compute-0 systemd-machined[154006]: New machine qemu-52-instance-0000006f.
Jan 22 22:36:26 compute-0 NetworkManager[54954]: <info>  [1769121386.7880] device (tapeecbb79f-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:36:26 compute-0 NetworkManager[54954]: <info>  [1769121386.7885] device (tapeecbb79f-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:36:26 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-0000006f.
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.797 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dd60750b-8e08-40fc-984f-4939470dd79b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 podman[227887]: 2026-01-22 22:36:26.807589711 +0000 UTC m=+0.088388038 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6)
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.827 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ce6306-bd12-4c8d-90c4-269c462150b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.832 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20eb84a6-8986-4822-91f2-b7e9ab12ef3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 NetworkManager[54954]: <info>  [1769121386.8340] manager: (tape65877e5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Jan 22 22:36:26 compute-0 podman[227886]: 2026-01-22 22:36:26.841550298 +0000 UTC m=+0.123280218 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:36:26 compute-0 ovn_controller[94850]: 2026-01-22T22:36:26Z|00443|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.870 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe78223-8e96-47c2-a047-eb0cc3a2c9cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.874 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8dbc97fe-a77d-4c30-be04-dff490e1f518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 NetworkManager[54954]: <info>  [1769121386.8967] device (tape65877e5-00): carrier: link connected
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.903 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc8269d-4598-43d6-a862-e73eabca02b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.925 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2b779c98-7b91-4c3b-8966-c9dc6973c2db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502449, 'reachable_time': 16554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227972, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 nova_compute[182725]: 2026-01-22 22:36:26.932 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.942 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8bee5872-db52-4b94-870c-adb08aa85324]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:bc8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502449, 'tstamp': 502449}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227973, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.960 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b8b10a-0852-41a6-944d-634f2bc2bc8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65877e5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:bc:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502449, 'reachable_time': 16554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227976, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:26.991 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a14ebc-6512-45a2-b61f-be944a71bd2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.063 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7799a9cc-7fcc-4cb1-9779-d3011ee5b322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.065 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.065 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.065 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65877e5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.067 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 NetworkManager[54954]: <info>  [1769121387.0684] manager: (tape65877e5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 22 22:36:27 compute-0 kernel: tape65877e5-00: entered promiscuous mode
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.070 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.073 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65877e5-00, col_values=(('external_ids', {'iface-id': '647d0114-938c-4844-a021-90b026a7e1fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.075 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 ovn_controller[94850]: 2026-01-22T22:36:27Z|00444|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.076 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.076 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.078 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[faba9b95-9f2b-4b85-9f2e-039e4e907d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.079 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e65877e5-0f40-472b-b31d-f3266eff5b5a.pid.haproxy
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e65877e5-0f40-472b-b31d-f3266eff5b5a
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:36:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:27.080 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'env', 'PROCESS_TAG=haproxy-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65877e5-0f40-472b-b31d-f3266eff5b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.087 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.120 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for b924048a-36af-45c3-80fd-9400d5975e6a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.120 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121387.1192083, b924048a-36af-45c3-80fd-9400d5975e6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.121 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Started (Lifecycle Event)
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.142 182729 DEBUG nova.compute.manager [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.143 182729 DEBUG nova.objects.instance [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'pci_devices' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.157 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.163 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.171 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance running successfully.
Jan 22 22:36:27 compute-0 virtqemud[182297]: argument unsupported: QEMU guest agent is not configured
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.175 182729 DEBUG nova.virt.libvirt.guest [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.175 182729 DEBUG nova.compute.manager [None req-8abec8ee-f9ae-4ee4-97c7-3169bfdbe874 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.190 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.191 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121387.125249, b924048a-36af-45c3-80fd-9400d5975e6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.191 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Resumed (Lifecycle Event)
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.214 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.219 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:36:27 compute-0 ovn_controller[94850]: 2026-01-22T22:36:27Z|00445|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 22:36:27 compute-0 ovn_controller[94850]: 2026-01-22T22:36:27Z|00446|binding|INFO|Releasing lport 647d0114-938c-4844-a021-90b026a7e1fc from this chassis (sb_readonly=0)
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.365 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:27 compute-0 podman[228013]: 2026-01-22 22:36:27.463166095 +0000 UTC m=+0.053740683 container create ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.493 182729 DEBUG nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.493 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.493 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.493 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.493 182729 DEBUG nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 WARNING nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state active and task_state None.
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 DEBUG nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 DEBUG oslo_concurrency.lockutils [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.494 182729 DEBUG nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.495 182729 WARNING nova.compute.manager [req-afe5cb64-01d0-4456-9440-c7c6a24f7cf9 req-2528ba27-700e-44da-8a13-9e2252795e2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state active and task_state None.
Jan 22 22:36:27 compute-0 systemd[1]: Started libpod-conmon-ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49.scope.
Jan 22 22:36:27 compute-0 podman[228013]: 2026-01-22 22:36:27.432615912 +0000 UTC m=+0.023190530 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:36:27 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:36:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a90015b4c604682a893a73191f284a74ec208c732da7b555e76cc7ff3acae42/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:36:27 compute-0 podman[228013]: 2026-01-22 22:36:27.563016697 +0000 UTC m=+0.153591305 container init ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:36:27 compute-0 podman[228013]: 2026-01-22 22:36:27.569677503 +0000 UTC m=+0.160252091 container start ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:36:27 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [NOTICE]   (228032) : New worker (228034) forked
Jan 22 22:36:27 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [NOTICE]   (228032) : Loading success.
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:27 compute-0 nova_compute[182725]: 2026-01-22 22:36:27.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:36:28 compute-0 nova_compute[182725]: 2026-01-22 22:36:28.805 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.089 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.943 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.943 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.943 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.944 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.944 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.959 182729 INFO nova.compute.manager [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Terminating instance
Jan 22 22:36:29 compute-0 nova_compute[182725]: 2026-01-22 22:36:29.975 182729 DEBUG nova.compute.manager [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:36:30 compute-0 kernel: tapeecbb79f-fd (unregistering): left promiscuous mode
Jan 22 22:36:30 compute-0 NetworkManager[54954]: <info>  [1769121390.0155] device (tapeecbb79f-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.018 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 ovn_controller[94850]: 2026-01-22T22:36:30Z|00447|binding|INFO|Releasing lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f from this chassis (sb_readonly=0)
Jan 22 22:36:30 compute-0 ovn_controller[94850]: 2026-01-22T22:36:30Z|00448|binding|INFO|Setting lport eecbb79f-fdf2-48a6-828b-d9fc528ec81f down in Southbound
Jan 22 22:36:30 compute-0 ovn_controller[94850]: 2026-01-22T22:36:30Z|00449|binding|INFO|Removing iface tapeecbb79f-fd ovn-installed in OVS
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.033 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:1f:e7 10.100.0.4'], port_security=['fa:16:3e:1f:1f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b924048a-36af-45c3-80fd-9400d5975e6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301c97a097c64afd8d55adb73fdd8cce', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'df09bfd1-e013-4e46-8445-f536aa6ced57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.190', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc9f3b29-b90f-4f9b-8f95-4c1f24f8566a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=eecbb79f-fdf2-48a6-828b-d9fc528ec81f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.035 104215 INFO neutron.agent.ovn.metadata.agent [-] Port eecbb79f-fdf2-48a6-828b-d9fc528ec81f in datapath e65877e5-0f40-472b-b31d-f3266eff5b5a unbound from our chassis
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.036 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65877e5-0f40-472b-b31d-f3266eff5b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.036 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.037 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3b5dca-daaf-4dad-83ef-6e8f51423947]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.037 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a namespace which is not needed anymore
Jan 22 22:36:30 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 22 22:36:30 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006f.scope: Consumed 3.285s CPU time.
Jan 22 22:36:30 compute-0 systemd-machined[154006]: Machine qemu-52-instance-0000006f terminated.
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [NOTICE]   (228032) : haproxy version is 2.8.14-c23fe91
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [NOTICE]   (228032) : path to executable is /usr/sbin/haproxy
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [WARNING]  (228032) : Exiting Master process...
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [WARNING]  (228032) : Exiting Master process...
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [ALERT]    (228032) : Current worker (228034) exited with code 143 (Terminated)
Jan 22 22:36:30 compute-0 neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a[228028]: [WARNING]  (228032) : All workers exited. Exiting... (0)
Jan 22 22:36:30 compute-0 systemd[1]: libpod-ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49.scope: Deactivated successfully.
Jan 22 22:36:30 compute-0 podman[228068]: 2026-01-22 22:36:30.163092848 +0000 UTC m=+0.040418480 container died ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:36:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49-userdata-shm.mount: Deactivated successfully.
Jan 22 22:36:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a90015b4c604682a893a73191f284a74ec208c732da7b555e76cc7ff3acae42-merged.mount: Deactivated successfully.
Jan 22 22:36:30 compute-0 podman[228068]: 2026-01-22 22:36:30.196765209 +0000 UTC m=+0.074090841 container cleanup ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:36:30 compute-0 systemd[1]: libpod-conmon-ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49.scope: Deactivated successfully.
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.237 182729 INFO nova.virt.libvirt.driver [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Instance destroyed successfully.
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.238 182729 DEBUG nova.objects.instance [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lazy-loading 'resources' on Instance uuid b924048a-36af-45c3-80fd-9400d5975e6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.259 182729 DEBUG nova.virt.libvirt.vif [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-826657528',display_name='tempest-ServerActionsTestJSON-server-826657528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-826657528',id=111,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIr/CBfiBeuMG9E3FIdJaaDFcpns/SWHskci9YhszHBYlth/xLx7qG/YmMmFu//p+hP6RskFOI2PKF0b+hRmxEollXh1AkE9rLD3jmP470P0AlDZN0YrDWBYIuqfVil9wQ==',key_name='tempest-keypair-2079051498',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301c97a097c64afd8d55adb73fdd8cce',ramdisk_id='',reservation_id='r-3ujf49su',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-587323330',owner_user_name='tempest-ServerActionsTestJSON-587323330-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='97ae504d8c4f43529c360266766791d0',uuid=b924048a-36af-45c3-80fd-9400d5975e6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.260 182729 DEBUG nova.network.os_vif_util [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converting VIF {"id": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "address": "fa:16:3e:1f:1f:e7", "network": {"id": "e65877e5-0f40-472b-b31d-f3266eff5b5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-546014484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301c97a097c64afd8d55adb73fdd8cce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeecbb79f-fd", "ovs_interfaceid": "eecbb79f-fdf2-48a6-828b-d9fc528ec81f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.261 182729 DEBUG nova.network.os_vif_util [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:36:30 compute-0 podman[228106]: 2026-01-22 22:36:30.261018953 +0000 UTC m=+0.042466881 container remove ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.261 182729 DEBUG os_vif [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.263 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.263 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeecbb79f-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.265 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.267 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.268 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2b52e5b5-0aaa-4e88-a7f7-b5a3071428f4]: (4, ('Thu Jan 22 10:36:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49)\nff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49\nThu Jan 22 10:36:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a (ff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49)\nff0b359024cf938d2ead33da89b3530dc9fefc951421bf688510826e76aacb49\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.269 182729 INFO os_vif [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=eecbb79f-fdf2-48a6-828b-d9fc528ec81f,network=Network(e65877e5-0f40-472b-b31d-f3266eff5b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeecbb79f-fd')
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.270 182729 INFO nova.virt.libvirt.driver [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Deleting instance files /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a_del
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.270 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6316c515-a10a-412e-8471-75ad5705ce0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.270 182729 INFO nova.virt.libvirt.driver [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Deletion of /var/lib/nova/instances/b924048a-36af-45c3-80fd-9400d5975e6a_del complete
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.271 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65877e5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:36:30 compute-0 kernel: tape65877e5-00: left promiscuous mode
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.275 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82b1bfe5-b2e7-48c8-89e3-5e2a7489e8c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.284 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.301 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e9b6db-4a28-41af-b58d-4422c94cdf6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.302 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e42e2b56-2a82-42b0-97d2-898eee4bca2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.318 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f756a96-546d-4716-9ff1-cd5bf4c8a667]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502442, 'reachable_time': 23758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228131, 'error': None, 'target': 'ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 systemd[1]: run-netns-ovnmeta\x2de65877e5\x2d0f40\x2d472b\x2db31d\x2df3266eff5b5a.mount: Deactivated successfully.
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.321 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65877e5-0f40-472b-b31d-f3266eff5b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:36:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:36:30.321 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[0e61683e-1a80-4343-a7a5-61a04800650a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.338 182729 INFO nova.compute.manager [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.339 182729 DEBUG oslo.service.loopingcall [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.339 182729 DEBUG nova.compute.manager [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.339 182729 DEBUG nova.network.neutron [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.383 182729 DEBUG nova.compute.manager [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.383 182729 DEBUG oslo_concurrency.lockutils [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.384 182729 DEBUG oslo_concurrency.lockutils [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.384 182729 DEBUG oslo_concurrency.lockutils [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.384 182729 DEBUG nova.compute.manager [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.384 182729 DEBUG nova.compute.manager [req-f044fdfe-97bd-4e1d-b9b6-ce36fc43639c req-c0118bc9-7dd1-421f-824d-e1c0b066b972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-unplugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:30 compute-0 nova_compute[182725]: 2026-01-22 22:36:30.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.680 182729 DEBUG nova.network.neutron [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.706 182729 INFO nova.compute.manager [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Took 1.37 seconds to deallocate network for instance.
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.785 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.785 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.876 182729 DEBUG nova.compute.provider_tree [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.892 182729 DEBUG nova.scheduler.client.report [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.917 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:31 compute-0 nova_compute[182725]: 2026-01-22 22:36:31.943 182729 INFO nova.scheduler.client.report [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Deleted allocations for instance b924048a-36af-45c3-80fd-9400d5975e6a
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.041 182729 DEBUG oslo_concurrency.lockutils [None req-24bf91a2-81ae-45f0-8550-b5d0001fa964 97ae504d8c4f43529c360266766791d0 301c97a097c64afd8d55adb73fdd8cce - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.476 182729 DEBUG nova.compute.manager [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.476 182729 DEBUG oslo_concurrency.lockutils [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.476 182729 DEBUG oslo_concurrency.lockutils [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.477 182729 DEBUG oslo_concurrency.lockutils [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b924048a-36af-45c3-80fd-9400d5975e6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.477 182729 DEBUG nova.compute.manager [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] No waiting events found dispatching network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.477 182729 WARNING nova.compute.manager [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received unexpected event network-vif-plugged-eecbb79f-fdf2-48a6-828b-d9fc528ec81f for instance with vm_state deleted and task_state None.
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.477 182729 DEBUG nova.compute.manager [req-f39af459-3900-4a9e-bbb3-fa7e36cf7c4f req-da053c0f-5aea-4ba5-baff-bd9795e2cf42 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Received event network-vif-deleted-eecbb79f-fdf2-48a6-828b-d9fc528ec81f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:36:32 compute-0 nova_compute[182725]: 2026-01-22 22:36:32.930 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:33 compute-0 nova_compute[182725]: 2026-01-22 22:36:33.777 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121378.7753546, 9dc942b5-8b65-4eb7-a57c-30d0a6221426 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:33 compute-0 nova_compute[182725]: 2026-01-22 22:36:33.777 182729 INFO nova.compute.manager [-] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] VM Stopped (Lifecycle Event)
Jan 22 22:36:33 compute-0 nova_compute[182725]: 2026-01-22 22:36:33.815 182729 DEBUG nova.compute.manager [None req-511d09d8-ce98-4d18-93a5-cd112699975d - - - - - -] [instance: 9dc942b5-8b65-4eb7-a57c-30d0a6221426] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:33 compute-0 nova_compute[182725]: 2026-01-22 22:36:33.909 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:34 compute-0 nova_compute[182725]: 2026-01-22 22:36:34.092 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:35 compute-0 nova_compute[182725]: 2026-01-22 22:36:35.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:36 compute-0 ovn_controller[94850]: 2026-01-22T22:36:36Z|00450|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 22:36:36 compute-0 nova_compute[182725]: 2026-01-22 22:36:36.606 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:36 compute-0 ovn_controller[94850]: 2026-01-22T22:36:36Z|00451|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 22:36:36 compute-0 nova_compute[182725]: 2026-01-22 22:36:36.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:37 compute-0 podman[228135]: 2026-01-22 22:36:37.146975898 +0000 UTC m=+0.067054235 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:36:37 compute-0 podman[228133]: 2026-01-22 22:36:37.168406233 +0000 UTC m=+0.095126956 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:36:37 compute-0 podman[228134]: 2026-01-22 22:36:37.188159446 +0000 UTC m=+0.105021083 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:36:39 compute-0 nova_compute[182725]: 2026-01-22 22:36:39.096 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:40 compute-0 nova_compute[182725]: 2026-01-22 22:36:40.270 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:44 compute-0 nova_compute[182725]: 2026-01-22 22:36:44.099 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:45 compute-0 nova_compute[182725]: 2026-01-22 22:36:45.236 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121390.2350397, b924048a-36af-45c3-80fd-9400d5975e6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:36:45 compute-0 nova_compute[182725]: 2026-01-22 22:36:45.237 182729 INFO nova.compute.manager [-] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] VM Stopped (Lifecycle Event)
Jan 22 22:36:45 compute-0 nova_compute[182725]: 2026-01-22 22:36:45.265 182729 DEBUG nova.compute.manager [None req-a49c7cd9-c85c-47dc-bd18-7c7610ac29f7 - - - - - -] [instance: b924048a-36af-45c3-80fd-9400d5975e6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:36:45 compute-0 nova_compute[182725]: 2026-01-22 22:36:45.272 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:49 compute-0 nova_compute[182725]: 2026-01-22 22:36:49.100 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:50 compute-0 nova_compute[182725]: 2026-01-22 22:36:50.276 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:52 compute-0 podman[228200]: 2026-01-22 22:36:52.151969142 +0000 UTC m=+0.079253609 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 22:36:54 compute-0 nova_compute[182725]: 2026-01-22 22:36:54.103 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:55 compute-0 nova_compute[182725]: 2026-01-22 22:36:55.279 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:36:57 compute-0 podman[228221]: 2026-01-22 22:36:57.14049188 +0000 UTC m=+0.062281555 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Jan 22 22:36:57 compute-0 podman[228220]: 2026-01-22 22:36:57.170293214 +0000 UTC m=+0.093254088 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:36:59 compute-0 nova_compute[182725]: 2026-01-22 22:36:59.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:00 compute-0 nova_compute[182725]: 2026-01-22 22:37:00.282 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:01 compute-0 anacron[103311]: Job `cron.daily' started
Jan 22 22:37:01 compute-0 anacron[103311]: Job `cron.daily' terminated
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.080 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.081 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.104 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.231 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.232 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.238 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.238 182729 INFO nova.compute.claims [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.388 182729 DEBUG nova.compute.provider_tree [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.402 182729 DEBUG nova.scheduler.client.report [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.424 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.425 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.471 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.471 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.487 182729 INFO nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.502 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.596 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.597 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.598 182729 INFO nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Creating image(s)
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.599 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.599 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.600 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.615 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.675 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.677 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.678 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.693 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.767 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.768 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.803 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.804 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.805 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.828 182729 DEBUG nova.policy [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.858 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.859 182729 DEBUG nova.virt.disk.api [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Checking if we can resize image /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.859 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.936 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.937 182729 DEBUG nova.virt.disk.api [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Cannot resize image /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.937 182729 DEBUG nova.objects.instance [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'migration_context' on Instance uuid bcc5887b-e062-48f9-a7e9-ba802d6426b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.952 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.953 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Ensure instance console log exists: /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.953 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.954 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:02 compute-0 nova_compute[182725]: 2026-01-22 22:37:02.954 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:04 compute-0 nova_compute[182725]: 2026-01-22 22:37:04.107 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:04 compute-0 nova_compute[182725]: 2026-01-22 22:37:04.243 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Successfully created port: bc005871-e45a-4122-b554-41bae6f88bd3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:37:05 compute-0 nova_compute[182725]: 2026-01-22 22:37:05.285 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:05 compute-0 nova_compute[182725]: 2026-01-22 22:37:05.957 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Successfully updated port: bc005871-e45a-4122-b554-41bae6f88bd3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:37:05 compute-0 nova_compute[182725]: 2026-01-22 22:37:05.972 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:37:05 compute-0 nova_compute[182725]: 2026-01-22 22:37:05.972 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquired lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:37:05 compute-0 nova_compute[182725]: 2026-01-22 22:37:05.972 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:37:06 compute-0 nova_compute[182725]: 2026-01-22 22:37:06.241 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.225 182729 DEBUG nova.compute.manager [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-changed-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.226 182729 DEBUG nova.compute.manager [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Refreshing instance network info cache due to event network-changed-bc005871-e45a-4122-b554-41bae6f88bd3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.226 182729 DEBUG oslo_concurrency.lockutils [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.677 182729 DEBUG nova.network.neutron [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Updating instance_info_cache with network_info: [{"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.694 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Releasing lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.695 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Instance network_info: |[{"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.695 182729 DEBUG oslo_concurrency.lockutils [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.696 182729 DEBUG nova.network.neutron [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Refreshing network info cache for port bc005871-e45a-4122-b554-41bae6f88bd3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.702 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Start _get_guest_xml network_info=[{"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.706 182729 WARNING nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.711 182729 DEBUG nova.virt.libvirt.host [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.711 182729 DEBUG nova.virt.libvirt.host [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.720 182729 DEBUG nova.virt.libvirt.host [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.721 182729 DEBUG nova.virt.libvirt.host [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.722 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.723 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.723 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.724 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.724 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.724 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.725 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.725 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.725 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.726 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.726 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.726 182729 DEBUG nova.virt.hardware [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.732 182729 DEBUG nova.virt.libvirt.vif [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1567167072',display_name='tempest-ServersTestJSON-server-1567167072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1567167072',id=120,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-ety479gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:02Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=bcc5887b-e062-48f9-a7e9-ba802d6426b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.732 182729 DEBUG nova.network.os_vif_util [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.733 182729 DEBUG nova.network.os_vif_util [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.734 182729 DEBUG nova.objects.instance [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcc5887b-e062-48f9-a7e9-ba802d6426b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.754 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <uuid>bcc5887b-e062-48f9-a7e9-ba802d6426b5</uuid>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <name>instance-00000078</name>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersTestJSON-server-1567167072</nova:name>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:37:07</nova:creationTime>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:user uuid="10767689cb2d4ee383920e3d388a6dfe">tempest-ServersTestJSON-1676167595-project-member</nova:user>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:project uuid="25a5678696f747b3ac42324626646e40">tempest-ServersTestJSON-1676167595</nova:project>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         <nova:port uuid="bc005871-e45a-4122-b554-41bae6f88bd3">
Jan 22 22:37:07 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <system>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="serial">bcc5887b-e062-48f9-a7e9-ba802d6426b5</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="uuid">bcc5887b-e062-48f9-a7e9-ba802d6426b5</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </system>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <os>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </os>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <features>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </features>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.config"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:72:30:35"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <target dev="tapbc005871-e4"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/console.log" append="off"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <video>
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </video>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:37:07 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:37:07 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:37:07 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:37:07 compute-0 nova_compute[182725]: </domain>
Jan 22 22:37:07 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.756 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Preparing to wait for external event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.756 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.757 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.757 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.758 182729 DEBUG nova.virt.libvirt.vif [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1567167072',display_name='tempest-ServersTestJSON-server-1567167072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1567167072',id=120,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-ety479gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:02Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=bcc5887b-e062-48f9-a7e9-ba802d6426b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.759 182729 DEBUG nova.network.os_vif_util [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.760 182729 DEBUG nova.network.os_vif_util [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.760 182729 DEBUG os_vif [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.761 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.762 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.762 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.765 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.765 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc005871-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.766 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc005871-e4, col_values=(('external_ids', {'iface-id': 'bc005871-e45a-4122-b554-41bae6f88bd3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:30:35', 'vm-uuid': 'bcc5887b-e062-48f9-a7e9-ba802d6426b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.767 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:07 compute-0 NetworkManager[54954]: <info>  [1769121427.7684] manager: (tapbc005871-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.770 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.773 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.774 182729 INFO os_vif [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4')
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.833 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.835 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.835 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No VIF found with MAC fa:16:3e:72:30:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:37:07 compute-0 nova_compute[182725]: 2026-01-22 22:37:07.835 182729 INFO nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Using config drive
Jan 22 22:37:07 compute-0 podman[228298]: 2026-01-22 22:37:07.858684361 +0000 UTC m=+0.052759258 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:37:07 compute-0 podman[228300]: 2026-01-22 22:37:07.867213414 +0000 UTC m=+0.053399104 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:37:07 compute-0 podman[228299]: 2026-01-22 22:37:07.893625773 +0000 UTC m=+0.084033388 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.566 182729 INFO nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Creating config drive at /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.config
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.576 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvp1ydb7h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.721 182729 DEBUG oslo_concurrency.processutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvp1ydb7h" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:08 compute-0 kernel: tapbc005871-e4: entered promiscuous mode
Jan 22 22:37:08 compute-0 NetworkManager[54954]: <info>  [1769121428.8104] manager: (tapbc005871-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Jan 22 22:37:08 compute-0 ovn_controller[94850]: 2026-01-22T22:37:08Z|00452|binding|INFO|Claiming lport bc005871-e45a-4122-b554-41bae6f88bd3 for this chassis.
Jan 22 22:37:08 compute-0 ovn_controller[94850]: 2026-01-22T22:37:08Z|00453|binding|INFO|bc005871-e45a-4122-b554-41bae6f88bd3: Claiming fa:16:3e:72:30:35 10.100.0.4
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.812 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 ovn_controller[94850]: 2026-01-22T22:37:08Z|00454|binding|INFO|Setting lport bc005871-e45a-4122-b554-41bae6f88bd3 ovn-installed in OVS
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.826 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.828 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.832 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 ovn_controller[94850]: 2026-01-22T22:37:08Z|00455|binding|INFO|Setting lport bc005871-e45a-4122-b554-41bae6f88bd3 up in Southbound
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.835 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:30:35 10.100.0.4'], port_security=['fa:16:3e:72:30:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bcc5887b-e062-48f9-a7e9-ba802d6426b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bc005871-e45a-4122-b554-41bae6f88bd3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.836 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bc005871-e45a-4122-b554-41bae6f88bd3 in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 bound to our chassis
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.838 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.856 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a044d781-a4ed-4612-8aa9-c5981808a4ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 systemd-machined[154006]: New machine qemu-53-instance-00000078.
Jan 22 22:37:08 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000078.
Jan 22 22:37:08 compute-0 systemd-udevd[228382]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.893 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[215fb59d-efa3-4e8e-aba3-b8c91ffae3f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.897 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3c843a-c513-4ccf-91b2-72c2ceb5f489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 NetworkManager[54954]: <info>  [1769121428.9024] device (tapbc005871-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:37:08 compute-0 NetworkManager[54954]: <info>  [1769121428.9032] device (tapbc005871-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.930 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cb42ac0a-f140-4024-abcb-327fbe8ff5bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.959 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ece34da9-2559-4eea-aa7f-13d2983c3d83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228392, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.978 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8140f4-c300-45ba-8497-a4ad2b93b991]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498346, 'tstamp': 498346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228394, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498349, 'tstamp': 498349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228394, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.981 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.984 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 nova_compute[182725]: 2026-01-22 22:37:08.985 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.986 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.986 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.987 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:08.988 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.109 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.246 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121429.2443664, bcc5887b-e062-48f9-a7e9-ba802d6426b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.246 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] VM Started (Lifecycle Event)
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.274 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.280 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121429.2448785, bcc5887b-e062-48f9-a7e9-ba802d6426b5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.281 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] VM Paused (Lifecycle Event)
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.308 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.313 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.319 182729 DEBUG nova.compute.manager [req-1cf2700a-ab41-479a-83ed-a66b3ee2975f req-996707bd-7965-4d2d-87af-71c9b7f7c3e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.320 182729 DEBUG oslo_concurrency.lockutils [req-1cf2700a-ab41-479a-83ed-a66b3ee2975f req-996707bd-7965-4d2d-87af-71c9b7f7c3e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.321 182729 DEBUG oslo_concurrency.lockutils [req-1cf2700a-ab41-479a-83ed-a66b3ee2975f req-996707bd-7965-4d2d-87af-71c9b7f7c3e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.321 182729 DEBUG oslo_concurrency.lockutils [req-1cf2700a-ab41-479a-83ed-a66b3ee2975f req-996707bd-7965-4d2d-87af-71c9b7f7c3e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.321 182729 DEBUG nova.compute.manager [req-1cf2700a-ab41-479a-83ed-a66b3ee2975f req-996707bd-7965-4d2d-87af-71c9b7f7c3e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Processing event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.322 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.328 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.332 182729 INFO nova.virt.libvirt.driver [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Instance spawned successfully.
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.333 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.338 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.338 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121429.3265538, bcc5887b-e062-48f9-a7e9-ba802d6426b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.338 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] VM Resumed (Lifecycle Event)
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.359 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.368 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.369 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.371 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.373 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.374 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.375 182729 DEBUG nova.virt.libvirt.driver [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.382 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.407 182729 DEBUG nova.network.neutron [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Updated VIF entry in instance network info cache for port bc005871-e45a-4122-b554-41bae6f88bd3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.408 182729 DEBUG nova.network.neutron [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Updating instance_info_cache with network_info: [{"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.418 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.430 182729 DEBUG oslo_concurrency.lockutils [req-fa855b71-b85c-4859-a103-6a9e3b0a7be9 req-b2eca953-b644-4157-a445-ff6e0ae9c21c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bcc5887b-e062-48f9-a7e9-ba802d6426b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.465 182729 INFO nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Took 6.87 seconds to spawn the instance on the hypervisor.
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.466 182729 DEBUG nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.569 182729 INFO nova.compute.manager [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Took 7.38 seconds to build instance.
Jan 22 22:37:09 compute-0 nova_compute[182725]: 2026-01-22 22:37:09.590 182729 DEBUG oslo_concurrency.lockutils [None req-f272543b-deed-4f49-907a-67dba96b3d0f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.439 182729 DEBUG nova.compute.manager [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.439 182729 DEBUG oslo_concurrency.lockutils [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.440 182729 DEBUG oslo_concurrency.lockutils [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.440 182729 DEBUG oslo_concurrency.lockutils [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.440 182729 DEBUG nova.compute.manager [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] No waiting events found dispatching network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:11 compute-0 nova_compute[182725]: 2026-01-22 22:37:11.441 182729 WARNING nova.compute.manager [req-fc1729e8-d3e0-4ae6-9e70-c1a157698b76 req-0be2a1f2-dad2-4991-8ea7-10f62b7e9a00 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received unexpected event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 for instance with vm_state active and task_state None.
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.445 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.446 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.446 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.769 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.790 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.791 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.791 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.791 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.791 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.804 182729 INFO nova.compute.manager [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Terminating instance
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.814 182729 DEBUG nova.compute.manager [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:37:12 compute-0 kernel: tapbc005871-e4 (unregistering): left promiscuous mode
Jan 22 22:37:12 compute-0 NetworkManager[54954]: <info>  [1769121432.8333] device (tapbc005871-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:37:12 compute-0 ovn_controller[94850]: 2026-01-22T22:37:12Z|00456|binding|INFO|Releasing lport bc005871-e45a-4122-b554-41bae6f88bd3 from this chassis (sb_readonly=0)
Jan 22 22:37:12 compute-0 ovn_controller[94850]: 2026-01-22T22:37:12Z|00457|binding|INFO|Setting lport bc005871-e45a-4122-b554-41bae6f88bd3 down in Southbound
Jan 22 22:37:12 compute-0 ovn_controller[94850]: 2026-01-22T22:37:12Z|00458|binding|INFO|Removing iface tapbc005871-e4 ovn-installed in OVS
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.847 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.848 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.859 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:30:35 10.100.0.4'], port_security=['fa:16:3e:72:30:35 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bcc5887b-e062-48f9-a7e9-ba802d6426b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bc005871-e45a-4122-b554-41bae6f88bd3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:12 compute-0 nova_compute[182725]: 2026-01-22 22:37:12.860 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.861 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bc005871-e45a-4122-b554-41bae6f88bd3 in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 unbound from our chassis
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.862 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.878 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[11c5d385-01e1-4de9-a5f1-b20067ecef1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:12 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 22 22:37:12 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000078.scope: Consumed 3.896s CPU time.
Jan 22 22:37:12 compute-0 systemd-machined[154006]: Machine qemu-53-instance-00000078 terminated.
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.916 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d43445e8-1738-4f29-a495-585c5202d851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.920 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c1036332-8622-45a6-bd20-0b1f71ca1125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.955 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[faa9bd3a-4d66-480f-818a-9631f17fc298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:12.976 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9be09b39-8bc5-4af1-95da-db85ef54caae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228413, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.000 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7df7874-a356-4358-9b52-7c29642475af]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498346, 'tstamp': 498346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228414, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498349, 'tstamp': 498349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228414, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.001 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.003 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.008 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.008 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.009 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:13.009 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.090 182729 INFO nova.virt.libvirt.driver [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Instance destroyed successfully.
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.091 182729 DEBUG nova.objects.instance [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'resources' on Instance uuid bcc5887b-e062-48f9-a7e9-ba802d6426b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.105 182729 DEBUG nova.virt.libvirt.vif [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:37:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1567167072',display_name='tempest-ServersTestJSON-server-1567167072',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1567167072',id=120,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:37:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-ety479gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:37:11Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=bcc5887b-e062-48f9-a7e9-ba802d6426b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.106 182729 DEBUG nova.network.os_vif_util [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bc005871-e45a-4122-b554-41bae6f88bd3", "address": "fa:16:3e:72:30:35", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc005871-e4", "ovs_interfaceid": "bc005871-e45a-4122-b554-41bae6f88bd3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.107 182729 DEBUG nova.network.os_vif_util [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.107 182729 DEBUG os_vif [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.111 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.111 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc005871-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.117 182729 INFO os_vif [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:30:35,bridge_name='br-int',has_traffic_filtering=True,id=bc005871-e45a-4122-b554-41bae6f88bd3,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc005871-e4')
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.117 182729 INFO nova.virt.libvirt.driver [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Deleting instance files /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5_del
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.118 182729 INFO nova.virt.libvirt.driver [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Deletion of /var/lib/nova/instances/bcc5887b-e062-48f9-a7e9-ba802d6426b5_del complete
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.207 182729 INFO nova.compute.manager [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.208 182729 DEBUG oslo.service.loopingcall [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.208 182729 DEBUG nova.compute.manager [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.208 182729 DEBUG nova.network.neutron [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.975 182729 DEBUG nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-unplugged-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.976 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.976 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.976 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.977 182729 DEBUG nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] No waiting events found dispatching network-vif-unplugged-bc005871-e45a-4122-b554-41bae6f88bd3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.977 182729 DEBUG nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-unplugged-bc005871-e45a-4122-b554-41bae6f88bd3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.977 182729 DEBUG nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.977 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.978 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.978 182729 DEBUG oslo_concurrency.lockutils [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.978 182729 DEBUG nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] No waiting events found dispatching network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:13 compute-0 nova_compute[182725]: 2026-01-22 22:37:13.978 182729 WARNING nova.compute.manager [req-2ae88261-60be-494f-9c2a-ba30a61e37fb req-3c47ed3c-ef71-4a67-b331-7b8d613bdb3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received unexpected event network-vif-plugged-bc005871-e45a-4122-b554-41bae6f88bd3 for instance with vm_state active and task_state deleting.
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.112 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.634 182729 DEBUG nova.network.neutron [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.667 182729 INFO nova.compute.manager [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Took 1.46 seconds to deallocate network for instance.
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.746 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.747 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.821 182729 DEBUG nova.compute.provider_tree [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.835 182729 DEBUG nova.scheduler.client.report [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.857 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.886 182729 INFO nova.scheduler.client.report [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Deleted allocations for instance bcc5887b-e062-48f9-a7e9-ba802d6426b5
Jan 22 22:37:14 compute-0 nova_compute[182725]: 2026-01-22 22:37:14.964 182729 DEBUG oslo_concurrency.lockutils [None req-354c5fe3-8772-47ec-86f3-03827f68cef5 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "bcc5887b-e062-48f9-a7e9-ba802d6426b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:16 compute-0 nova_compute[182725]: 2026-01-22 22:37:16.119 182729 DEBUG nova.compute.manager [req-5ac8ee8f-fbc3-45fc-b5b8-23806cf5fe38 req-b0a9d421-0996-4520-901f-0e1d01bb11e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Received event network-vif-deleted-bc005871-e45a-4122-b554-41bae6f88bd3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:16.713 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:16 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:16.714 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:37:16 compute-0 nova_compute[182725]: 2026-01-22 22:37:16.714 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:17 compute-0 nova_compute[182725]: 2026-01-22 22:37:17.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.861 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.861 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.891 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.988 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.988 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.993 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:37:18 compute-0 nova_compute[182725]: 2026-01-22 22:37:18.993 182729 INFO nova.compute.claims [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.121 182729 DEBUG nova.compute.provider_tree [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.135 182729 DEBUG nova.scheduler.client.report [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.155 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.156 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.224 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.225 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.245 182729 INFO nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.266 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.402 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.403 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.403 182729 INFO nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Creating image(s)
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.405 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.406 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.406 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.419 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.475 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.476 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.477 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.487 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.541 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.542 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.601 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.602 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.602 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.678 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.679 182729 DEBUG nova.virt.disk.api [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Checking if we can resize image /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.679 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.736 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.737 182729 DEBUG nova.virt.disk.api [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Cannot resize image /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.737 182729 DEBUG nova.objects.instance [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'migration_context' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.753 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.754 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Ensure instance console log exists: /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.755 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.755 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.755 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:19 compute-0 nova_compute[182725]: 2026-01-22 22:37:19.775 182729 DEBUG nova.policy [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:37:20 compute-0 nova_compute[182725]: 2026-01-22 22:37:20.930 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Successfully created port: bedba378-9c7e-4b0d-b3ca-3111918e5cff _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.747 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Successfully updated port: bedba378-9c7e-4b0d-b3ca-3111918e5cff _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.778 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.778 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquired lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.779 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.907 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 22:37:21 compute-0 nova_compute[182725]: 2026-01-22 22:37:21.912 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.065 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.066 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.066 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.066 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 49096ce9-494c-4c84-b263-86a05230d8af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.413 182729 DEBUG nova.compute.manager [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-changed-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.414 182729 DEBUG nova.compute.manager [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Refreshing instance network info cache due to event network-changed-bedba378-9c7e-4b0d-b3ca-3111918e5cff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:37:22 compute-0 nova_compute[182725]: 2026-01-22 22:37:22.414 182729 DEBUG oslo_concurrency.lockutils [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.116 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:23 compute-0 podman[228448]: 2026-01-22 22:37:23.137696486 +0000 UTC m=+0.065179108 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.188 182729 DEBUG nova.network.neutron [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Updating instance_info_cache with network_info: [{"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.211 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Releasing lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.211 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance network_info: |[{"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.212 182729 DEBUG oslo_concurrency.lockutils [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.212 182729 DEBUG nova.network.neutron [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Refreshing network info cache for port bedba378-9c7e-4b0d-b3ca-3111918e5cff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.214 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Start _get_guest_xml network_info=[{"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.218 182729 WARNING nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.222 182729 DEBUG nova.virt.libvirt.host [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.223 182729 DEBUG nova.virt.libvirt.host [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.228 182729 DEBUG nova.virt.libvirt.host [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.228 182729 DEBUG nova.virt.libvirt.host [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.229 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.229 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.230 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.230 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.230 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.230 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.230 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.231 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.231 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.231 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.231 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.232 182729 DEBUG nova.virt.hardware [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.235 182729 DEBUG nova.virt.libvirt.vif [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1612502526',display_name='tempest-ServersTestJSON-server-1612502526',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1612502526',id=121,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-exit6fp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:19Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=1297d9cc-74e6-4d32-b1c2-126898473271,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.235 182729 DEBUG nova.network.os_vif_util [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.236 182729 DEBUG nova.network.os_vif_util [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.237 182729 DEBUG nova.objects.instance [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.251 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <uuid>1297d9cc-74e6-4d32-b1c2-126898473271</uuid>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <name>instance-00000079</name>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersTestJSON-server-1612502526</nova:name>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:37:23</nova:creationTime>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:user uuid="10767689cb2d4ee383920e3d388a6dfe">tempest-ServersTestJSON-1676167595-project-member</nova:user>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:project uuid="25a5678696f747b3ac42324626646e40">tempest-ServersTestJSON-1676167595</nova:project>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         <nova:port uuid="bedba378-9c7e-4b0d-b3ca-3111918e5cff">
Jan 22 22:37:23 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <system>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="serial">1297d9cc-74e6-4d32-b1c2-126898473271</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="uuid">1297d9cc-74e6-4d32-b1c2-126898473271</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </system>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <os>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </os>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <features>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </features>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.config"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:81:bb:e8"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <target dev="tapbedba378-9c"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/console.log" append="off"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <video>
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </video>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:37:23 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:37:23 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:37:23 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:37:23 compute-0 nova_compute[182725]: </domain>
Jan 22 22:37:23 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.252 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Preparing to wait for external event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.253 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.253 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.253 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.254 182729 DEBUG nova.virt.libvirt.vif [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1612502526',display_name='tempest-ServersTestJSON-server-1612502526',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1612502526',id=121,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-exit6fp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:19Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=1297d9cc-74e6-4d32-b1c2-126898473271,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.254 182729 DEBUG nova.network.os_vif_util [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.255 182729 DEBUG nova.network.os_vif_util [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.255 182729 DEBUG os_vif [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.256 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.256 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.256 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.259 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.259 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbedba378-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.260 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbedba378-9c, col_values=(('external_ids', {'iface-id': 'bedba378-9c7e-4b0d-b3ca-3111918e5cff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:bb:e8', 'vm-uuid': '1297d9cc-74e6-4d32-b1c2-126898473271'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.262 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:23 compute-0 NetworkManager[54954]: <info>  [1769121443.2628] manager: (tapbedba378-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.268 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.269 182729 INFO os_vif [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c')
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.334 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.334 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.335 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No VIF found with MAC fa:16:3e:81:bb:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:37:23 compute-0 nova_compute[182725]: 2026-01-22 22:37:23.335 182729 INFO nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Using config drive
Jan 22 22:37:23 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:23.716 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.116 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.157 182729 INFO nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Creating config drive at /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.config
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.162 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeklr74t0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.218 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updating instance_info_cache with network_info: [{"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.246 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-49096ce9-494c-4c84-b263-86a05230d8af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.247 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.247 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.278 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.278 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.279 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.279 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.292 182729 DEBUG oslo_concurrency.processutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeklr74t0" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:24 compute-0 kernel: tapbedba378-9c: entered promiscuous mode
Jan 22 22:37:24 compute-0 NetworkManager[54954]: <info>  [1769121444.3559] manager: (tapbedba378-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 22 22:37:24 compute-0 ovn_controller[94850]: 2026-01-22T22:37:24Z|00459|binding|INFO|Claiming lport bedba378-9c7e-4b0d-b3ca-3111918e5cff for this chassis.
Jan 22 22:37:24 compute-0 ovn_controller[94850]: 2026-01-22T22:37:24Z|00460|binding|INFO|bedba378-9c7e-4b0d-b3ca-3111918e5cff: Claiming fa:16:3e:81:bb:e8 10.100.0.6
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.366 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:bb:e8 10.100.0.6'], port_security=['fa:16:3e:81:bb:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1297d9cc-74e6-4d32-b1c2-126898473271', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bedba378-9c7e-4b0d-b3ca-3111918e5cff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.367 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bedba378-9c7e-4b0d-b3ca-3111918e5cff in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 bound to our chassis
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.369 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:37:24 compute-0 systemd-udevd[228488]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:37:24 compute-0 ovn_controller[94850]: 2026-01-22T22:37:24Z|00461|binding|INFO|Setting lport bedba378-9c7e-4b0d-b3ca-3111918e5cff ovn-installed in OVS
Jan 22 22:37:24 compute-0 ovn_controller[94850]: 2026-01-22T22:37:24Z|00462|binding|INFO|Setting lport bedba378-9c7e-4b0d-b3ca-3111918e5cff up in Southbound
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.386 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.389 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.388 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eb15eb96-16cd-49fd-bb2a-85c4b9436537]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 NetworkManager[54954]: <info>  [1769121444.3984] device (tapbedba378-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:37:24 compute-0 NetworkManager[54954]: <info>  [1769121444.3990] device (tapbedba378-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:37:24 compute-0 systemd-machined[154006]: New machine qemu-54-instance-00000079.
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.422 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[dff79a37-2d4d-4507-9929-ff0eb14351d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.425 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9988d562-9e03-4f0e-a939-d8b86d47811b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000079.
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.459 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[73b470ef-0a38-4d8a-85a8-21adbf1d69ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.479 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0e42e929-e7d2-417a-b23f-68ee7dd792af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228503, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.497 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[18c9613a-6657-48b2-aad0-a71dde669a7e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498346, 'tstamp': 498346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228504, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498349, 'tstamp': 498349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228504, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.499 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.501 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.503 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.503 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.504 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:24.504 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.510 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.581 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.582 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.655 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.663 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.691 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121444.691376, 1297d9cc-74e6-4d32-b1c2-126898473271 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.692 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] VM Started (Lifecycle Event)
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.715 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.716 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.736 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.742 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121444.6915894, 1297d9cc-74e6-4d32-b1c2-126898473271 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.742 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] VM Paused (Lifecycle Event)
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.765 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.767 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.773 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.793 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.948 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.950 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5501MB free_disk=73.30346298217773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.950 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:24 compute-0 nova_compute[182725]: 2026-01-22 22:37:24.950 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.083 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 49096ce9-494c-4c84-b263-86a05230d8af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.083 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 1297d9cc-74e6-4d32-b1c2-126898473271 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.084 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.084 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.142 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.155 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.176 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.176 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.366 182729 DEBUG nova.network.neutron [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Updated VIF entry in instance network info cache for port bedba378-9c7e-4b0d-b3ca-3111918e5cff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.367 182729 DEBUG nova.network.neutron [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Updating instance_info_cache with network_info: [{"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.382 182729 DEBUG oslo_concurrency.lockutils [req-d3933029-ef78-4d77-8b13-4f101a91a848 req-3c752a0a-92ce-4fa8-b80f-41fbf9a4182f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1297d9cc-74e6-4d32-b1c2-126898473271" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.817 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:25 compute-0 nova_compute[182725]: 2026-01-22 22:37:25.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.709 182729 DEBUG nova.compute.manager [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.709 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.710 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.710 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.710 182729 DEBUG nova.compute.manager [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Processing event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.710 182729 DEBUG nova.compute.manager [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.710 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.711 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.711 182729 DEBUG oslo_concurrency.lockutils [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.711 182729 DEBUG nova.compute.manager [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] No waiting events found dispatching network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.711 182729 WARNING nova.compute.manager [req-93570981-bdf6-4a9d-81a3-cfe92f581777 req-c266bb02-b571-4ac4-9a34-5d5876f20787 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received unexpected event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff for instance with vm_state building and task_state spawning.
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.712 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.716 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.717 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121446.7171042, 1297d9cc-74e6-4d32-b1c2-126898473271 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.717 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] VM Resumed (Lifecycle Event)
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.722 182729 INFO nova.virt.libvirt.driver [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance spawned successfully.
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.723 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.755 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.761 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.765 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.765 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.766 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.766 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.767 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.767 182729 DEBUG nova.virt.libvirt.driver [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.801 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.850 182729 INFO nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Took 7.45 seconds to spawn the instance on the hypervisor.
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.851 182729 DEBUG nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.938 182729 INFO nova.compute.manager [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Took 7.98 seconds to build instance.
Jan 22 22:37:26 compute-0 nova_compute[182725]: 2026-01-22 22:37:26.955 182729 DEBUG oslo_concurrency.lockutils [None req-9c4533eb-03a4-4b27-807d-4d73b8600fcc 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:27 compute-0 nova_compute[182725]: 2026-01-22 22:37:27.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:27 compute-0 nova_compute[182725]: 2026-01-22 22:37:27.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:37:28 compute-0 nova_compute[182725]: 2026-01-22 22:37:28.088 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121433.0876753, bcc5887b-e062-48f9-a7e9-ba802d6426b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:28 compute-0 nova_compute[182725]: 2026-01-22 22:37:28.089 182729 INFO nova.compute.manager [-] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] VM Stopped (Lifecycle Event)
Jan 22 22:37:28 compute-0 nova_compute[182725]: 2026-01-22 22:37:28.108 182729 DEBUG nova.compute.manager [None req-1730f887-15c0-47b1-9de9-10000cd23723 - - - - - -] [instance: bcc5887b-e062-48f9-a7e9-ba802d6426b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:28 compute-0 podman[228527]: 2026-01-22 22:37:28.129449838 +0000 UTC m=+0.060714377 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git)
Jan 22 22:37:28 compute-0 podman[228526]: 2026-01-22 22:37:28.160915833 +0000 UTC m=+0.093665489 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:37:28 compute-0 nova_compute[182725]: 2026-01-22 22:37:28.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.118 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.549 182729 DEBUG oslo_concurrency.lockutils [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.549 182729 DEBUG oslo_concurrency.lockutils [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.550 182729 DEBUG nova.compute.manager [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.552 182729 DEBUG nova.compute.manager [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.553 182729 DEBUG nova.objects.instance [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'flavor' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.585 182729 DEBUG nova.objects.instance [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'info_cache' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:29 compute-0 nova_compute[182725]: 2026-01-22 22:37:29.613 182729 DEBUG nova.virt.libvirt.driver [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:37:31 compute-0 nova_compute[182725]: 2026-01-22 22:37:31.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:32 compute-0 nova_compute[182725]: 2026-01-22 22:37:32.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:33 compute-0 nova_compute[182725]: 2026-01-22 22:37:33.267 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:34 compute-0 nova_compute[182725]: 2026-01-22 22:37:34.120 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:38 compute-0 podman[228588]: 2026-01-22 22:37:38.131628345 +0000 UTC m=+0.060333457 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:37:38 compute-0 podman[228590]: 2026-01-22 22:37:38.137198284 +0000 UTC m=+0.056376598 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:37:38 compute-0 podman[228589]: 2026-01-22 22:37:38.14545595 +0000 UTC m=+0.068560333 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:37:38 compute-0 nova_compute[182725]: 2026-01-22 22:37:38.269 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:38 compute-0 nova_compute[182725]: 2026-01-22 22:37:38.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:37:39 compute-0 ovn_controller[94850]: 2026-01-22T22:37:39Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:81:bb:e8 10.100.0.6
Jan 22 22:37:39 compute-0 ovn_controller[94850]: 2026-01-22T22:37:39Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:81:bb:e8 10.100.0.6
Jan 22 22:37:39 compute-0 nova_compute[182725]: 2026-01-22 22:37:39.122 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:39 compute-0 nova_compute[182725]: 2026-01-22 22:37:39.667 182729 DEBUG nova.virt.libvirt.driver [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:37:41 compute-0 kernel: tapbedba378-9c (unregistering): left promiscuous mode
Jan 22 22:37:41 compute-0 NetworkManager[54954]: <info>  [1769121461.9479] device (tapbedba378-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:37:41 compute-0 ovn_controller[94850]: 2026-01-22T22:37:41Z|00463|binding|INFO|Releasing lport bedba378-9c7e-4b0d-b3ca-3111918e5cff from this chassis (sb_readonly=0)
Jan 22 22:37:41 compute-0 ovn_controller[94850]: 2026-01-22T22:37:41Z|00464|binding|INFO|Setting lport bedba378-9c7e-4b0d-b3ca-3111918e5cff down in Southbound
Jan 22 22:37:41 compute-0 ovn_controller[94850]: 2026-01-22T22:37:41Z|00465|binding|INFO|Removing iface tapbedba378-9c ovn-installed in OVS
Jan 22 22:37:41 compute-0 nova_compute[182725]: 2026-01-22 22:37:41.953 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:41 compute-0 nova_compute[182725]: 2026-01-22 22:37:41.956 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:41.963 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:bb:e8 10.100.0.6'], port_security=['fa:16:3e:81:bb:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1297d9cc-74e6-4d32-b1c2-126898473271', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=bedba378-9c7e-4b0d-b3ca-3111918e5cff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:41 compute-0 nova_compute[182725]: 2026-01-22 22:37:41.966 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:41.968 104215 INFO neutron.agent.ovn.metadata.agent [-] Port bedba378-9c7e-4b0d-b3ca-3111918e5cff in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 unbound from our chassis
Jan 22 22:37:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:41.971 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 22:37:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:41.997 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f29564d9-0390-4051-920b-3b0d8da8d86f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 22 22:37:42 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000079.scope: Consumed 12.088s CPU time.
Jan 22 22:37:42 compute-0 systemd-machined[154006]: Machine qemu-54-instance-00000079 terminated.
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.044 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[98edb33c-91e4-41ef-990a-159a9ac780da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.050 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fb916cf0-0758-4f03-9b84-d5454ae8b36e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.084 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[72c9f181-a2b3-494f-aa8e-a365131344fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.109 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5c9942fc-402f-4359-93ed-e276d440ddd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498330, 'reachable_time': 23825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228665, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.130 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cd48c6ee-243d-4969-a5b5-e61e0e1299c2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498346, 'tstamp': 498346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228666, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8cfbdc2a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498349, 'tstamp': 498349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228666, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.132 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.135 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.139 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.140 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.141 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.141 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:42.142 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.244 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.252 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.330 182729 DEBUG nova.compute.manager [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-vif-unplugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.331 182729 DEBUG oslo_concurrency.lockutils [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.331 182729 DEBUG oslo_concurrency.lockutils [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.332 182729 DEBUG oslo_concurrency.lockutils [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.332 182729 DEBUG nova.compute.manager [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] No waiting events found dispatching network-vif-unplugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.332 182729 WARNING nova.compute.manager [req-b6f5f33d-8544-42b6-97c8-bae728a335f9 req-95c8b969-950e-4d53-9c91-7f84ddab17c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received unexpected event network-vif-unplugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff for instance with vm_state active and task_state powering-off.
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.684 182729 INFO nova.virt.libvirt.driver [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance shutdown successfully after 13 seconds.
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.694 182729 INFO nova.virt.libvirt.driver [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance destroyed successfully.
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.695 182729 DEBUG nova.objects.instance [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.745 182729 DEBUG nova.compute.manager [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:42 compute-0 nova_compute[182725]: 2026-01-22 22:37:42.892 182729 DEBUG oslo_concurrency.lockutils [None req-9e440569-fb3f-4457-a3f8-69bc9e8821f7 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:43 compute-0 nova_compute[182725]: 2026-01-22 22:37:43.272 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.124 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.425 182729 DEBUG nova.compute.manager [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.426 182729 DEBUG oslo_concurrency.lockutils [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.427 182729 DEBUG oslo_concurrency.lockutils [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.427 182729 DEBUG oslo_concurrency.lockutils [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.428 182729 DEBUG nova.compute.manager [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] No waiting events found dispatching network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:44 compute-0 nova_compute[182725]: 2026-01-22 22:37:44.428 182729 WARNING nova.compute.manager [req-f866e8de-f342-4bea-8d69-a27dc20c2071 req-f6e7a66e-92ae-4408-898e-52e1f6477721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received unexpected event network-vif-plugged-bedba378-9c7e-4b0d-b3ca-3111918e5cff for instance with vm_state stopped and task_state None.
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.748 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.749 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.750 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.750 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.750 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.766 182729 INFO nova.compute.manager [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Terminating instance
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.779 182729 DEBUG nova.compute.manager [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.787 182729 INFO nova.virt.libvirt.driver [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Instance destroyed successfully.
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.788 182729 DEBUG nova.objects.instance [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'resources' on Instance uuid 1297d9cc-74e6-4d32-b1c2-126898473271 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.805 182729 DEBUG nova.virt.libvirt.vif [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:37:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1612502526',display_name='tempest-Íñstáñcé-468627036',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1612502526',id=121,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:37:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-exit6fp1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:37:44Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=1297d9cc-74e6-4d32-b1c2-126898473271,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.806 182729 DEBUG nova.network.os_vif_util [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "address": "fa:16:3e:81:bb:e8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbedba378-9c", "ovs_interfaceid": "bedba378-9c7e-4b0d-b3ca-3111918e5cff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.807 182729 DEBUG nova.network.os_vif_util [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.808 182729 DEBUG os_vif [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.810 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.811 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbedba378-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.813 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.815 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.819 182729 INFO os_vif [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:bb:e8,bridge_name='br-int',has_traffic_filtering=True,id=bedba378-9c7e-4b0d-b3ca-3111918e5cff,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbedba378-9c')
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.820 182729 INFO nova.virt.libvirt.driver [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Deleting instance files /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271_del
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.821 182729 INFO nova.virt.libvirt.driver [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Deletion of /var/lib/nova/instances/1297d9cc-74e6-4d32-b1c2-126898473271_del complete
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.896 182729 INFO nova.compute.manager [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Took 0.12 seconds to destroy the instance on the hypervisor.
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.897 182729 DEBUG oslo.service.loopingcall [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.898 182729 DEBUG nova.compute.manager [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:37:45 compute-0 nova_compute[182725]: 2026-01-22 22:37:45.898 182729 DEBUG nova.network.neutron [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.330 182729 DEBUG nova.network.neutron [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.356 182729 INFO nova.compute.manager [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Took 1.46 seconds to deallocate network for instance.
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.431 182729 DEBUG nova.compute.manager [req-e7f5111c-b180-46ca-b1cd-d2df069a31a2 req-9441a7c9-1aa2-4423-a9c9-a06eb9c3046d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Received event network-vif-deleted-bedba378-9c7e-4b0d-b3ca-3111918e5cff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.451 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.451 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.540 182729 DEBUG nova.compute.provider_tree [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.556 182729 DEBUG nova.scheduler.client.report [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.583 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.609 182729 INFO nova.scheduler.client.report [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Deleted allocations for instance 1297d9cc-74e6-4d32-b1c2-126898473271
Jan 22 22:37:47 compute-0 nova_compute[182725]: 2026-01-22 22:37:47.711 182729 DEBUG oslo_concurrency.lockutils [None req-0e611ddb-0e93-4974-af4f-d425dd68a850 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "1297d9cc-74e6-4d32-b1c2-126898473271" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.962s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.127 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.254 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.255 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.255 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.256 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.256 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.274 182729 INFO nova.compute.manager [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Terminating instance
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.291 182729 DEBUG nova.compute.manager [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:37:49 compute-0 kernel: tap9b7c9dcb-22 (unregistering): left promiscuous mode
Jan 22 22:37:49 compute-0 NetworkManager[54954]: <info>  [1769121469.3298] device (tap9b7c9dcb-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:37:49 compute-0 ovn_controller[94850]: 2026-01-22T22:37:49Z|00466|binding|INFO|Releasing lport 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 from this chassis (sb_readonly=0)
Jan 22 22:37:49 compute-0 ovn_controller[94850]: 2026-01-22T22:37:49Z|00467|binding|INFO|Setting lport 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 down in Southbound
Jan 22 22:37:49 compute-0 ovn_controller[94850]: 2026-01-22T22:37:49Z|00468|binding|INFO|Removing iface tap9b7c9dcb-22 ovn-installed in OVS
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.343 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.356 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:1b:c8 10.100.0.14'], port_security=['fa:16:3e:aa:1b:c8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '49096ce9-494c-4c84-b263-86a05230d8af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.359 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 unbound from our chassis
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.361 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.363 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[27c18285-fe50-496d-9e7e-6b733e7be1bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.364 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 namespace which is not needed anymore
Jan 22 22:37:49 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Deactivated successfully.
Jan 22 22:37:49 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000070.scope: Consumed 17.811s CPU time.
Jan 22 22:37:49 compute-0 systemd-machined[154006]: Machine qemu-50-instance-00000070 terminated.
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [NOTICE]   (227258) : haproxy version is 2.8.14-c23fe91
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [NOTICE]   (227258) : path to executable is /usr/sbin/haproxy
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [WARNING]  (227258) : Exiting Master process...
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [WARNING]  (227258) : Exiting Master process...
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [ALERT]    (227258) : Current worker (227260) exited with code 143 (Terminated)
Jan 22 22:37:49 compute-0 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227254]: [WARNING]  (227258) : All workers exited. Exiting... (0)
Jan 22 22:37:49 compute-0 systemd[1]: libpod-7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e.scope: Deactivated successfully.
Jan 22 22:37:49 compute-0 podman[228708]: 2026-01-22 22:37:49.518872304 +0000 UTC m=+0.049096296 container died 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:37:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e-userdata-shm.mount: Deactivated successfully.
Jan 22 22:37:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-78185cb462a48e133227c168e3ae057c635ad4861d3609b2f0033639219b978c-merged.mount: Deactivated successfully.
Jan 22 22:37:49 compute-0 podman[228708]: 2026-01-22 22:37:49.56395587 +0000 UTC m=+0.094179872 container cleanup 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.565 182729 INFO nova.virt.libvirt.driver [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Instance destroyed successfully.
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.565 182729 DEBUG nova.objects.instance [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'resources' on Instance uuid 49096ce9-494c-4c84-b263-86a05230d8af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:49 compute-0 systemd[1]: libpod-conmon-7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e.scope: Deactivated successfully.
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.602 182729 DEBUG nova.virt.libvirt.vif [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-1722590006',display_name='tempest-₡-1722590006',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1722590006',id=112,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-5ci62lkw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:46Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=49096ce9-494c-4c84-b263-86a05230d8af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.603 182729 DEBUG nova.network.os_vif_util [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "address": "fa:16:3e:aa:1b:c8", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b7c9dcb-22", "ovs_interfaceid": "9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.604 182729 DEBUG nova.network.os_vif_util [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.604 182729 DEBUG os_vif [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.605 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.605 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b7c9dcb-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.607 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.609 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.611 182729 INFO os_vif [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:1b:c8,bridge_name='br-int',has_traffic_filtering=True,id=9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b7c9dcb-22')
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.611 182729 INFO nova.virt.libvirt.driver [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Deleting instance files /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af_del
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.612 182729 INFO nova.virt.libvirt.driver [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Deletion of /var/lib/nova/instances/49096ce9-494c-4c84-b263-86a05230d8af_del complete
Jan 22 22:37:49 compute-0 podman[228754]: 2026-01-22 22:37:49.636540651 +0000 UTC m=+0.046853520 container remove 7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.643 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[85c78292-eb59-4cb2-9aaf-3cc5d284ec8b]: (4, ('Thu Jan 22 10:37:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 (7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e)\n7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e\nThu Jan 22 10:37:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 (7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e)\n7e76d6e37fa08b0aba8a9abcbf34fbf8dff66a53eaf9dcf242a9520e4e6d442e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.645 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d18f121b-76b3-45e2-93c0-588bc9f70b11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.646 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.648 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 kernel: tap8cfbdc2a-d0: left promiscuous mode
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.660 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.663 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ba4436-0ab7-4cec-a571-354b9a01f434]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.673 182729 DEBUG nova.compute.manager [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-unplugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.674 182729 DEBUG oslo_concurrency.lockutils [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.674 182729 DEBUG oslo_concurrency.lockutils [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.674 182729 DEBUG oslo_concurrency.lockutils [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.674 182729 DEBUG nova.compute.manager [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] No waiting events found dispatching network-vif-unplugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.675 182729 DEBUG nova.compute.manager [req-2fe5ceca-2499-4a2a-86f7-0ef0637effd6 req-0d78c718-67ec-48e5-b204-8c3030c365f3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-unplugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.678 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4192e2-1e45-40d9-8b2b-5261c5d54236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.680 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[35b1242f-45b3-443d-8c77-27b3fb1a248c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.689 182729 INFO nova.compute.manager [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.689 182729 DEBUG oslo.service.loopingcall [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.690 182729 DEBUG nova.compute.manager [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:37:49 compute-0 nova_compute[182725]: 2026-01-22 22:37:49.690 182729 DEBUG nova.network.neutron [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.705 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9c085dd7-abda-4d2e-84dc-6fafbf8124af]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498320, 'reachable_time': 20397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228769, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.708 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:37:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:37:49.708 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[256597e9-3732-413b-8b60-fabed425ff61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:37:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d8cfbdc2a\x2dd644\x2d40be\x2db1e2\x2d2d2471aaf695.mount: Deactivated successfully.
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.577 182729 DEBUG nova.network.neutron [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.598 182729 INFO nova.compute.manager [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Took 0.91 seconds to deallocate network for instance.
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.686 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.686 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.734 182729 DEBUG nova.compute.provider_tree [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.750 182729 DEBUG nova.scheduler.client.report [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.772 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.821 182729 INFO nova.scheduler.client.report [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Deleted allocations for instance 49096ce9-494c-4c84-b263-86a05230d8af
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.914 182729 DEBUG oslo_concurrency.lockutils [None req-a7f6edf6-65a6-4589-a462-4c245aafbd9f 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:50 compute-0 nova_compute[182725]: 2026-01-22 22:37:50.940 182729 DEBUG nova.compute.manager [req-40394fa1-fa59-4a4f-9f59-1c6f6a813c01 req-635157e4-5b48-481c-8394-4335ecd83002 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-deleted-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.865 182729 DEBUG nova.compute.manager [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.866 182729 DEBUG oslo_concurrency.lockutils [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49096ce9-494c-4c84-b263-86a05230d8af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.867 182729 DEBUG oslo_concurrency.lockutils [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.867 182729 DEBUG oslo_concurrency.lockutils [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49096ce9-494c-4c84-b263-86a05230d8af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.867 182729 DEBUG nova.compute.manager [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] No waiting events found dispatching network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:37:51 compute-0 nova_compute[182725]: 2026-01-22 22:37:51.868 182729 WARNING nova.compute.manager [req-252af40d-151d-4619-a909-f14a8a3842be req-808a125e-32fc-4cd1-95bf-f6616eb85926 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Received unexpected event network-vif-plugged-9b7c9dcb-22ae-4b93-b9f9-86115fffa8e4 for instance with vm_state deleted and task_state None.
Jan 22 22:37:54 compute-0 nova_compute[182725]: 2026-01-22 22:37:54.129 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:54 compute-0 podman[228770]: 2026-01-22 22:37:54.184679029 +0000 UTC m=+0.106059619 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:37:54 compute-0 nova_compute[182725]: 2026-01-22 22:37:54.609 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:55 compute-0 nova_compute[182725]: 2026-01-22 22:37:55.110 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:57 compute-0 nova_compute[182725]: 2026-01-22 22:37:57.288 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121462.2868366, 1297d9cc-74e6-4d32-b1c2-126898473271 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:37:57 compute-0 nova_compute[182725]: 2026-01-22 22:37:57.288 182729 INFO nova.compute.manager [-] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] VM Stopped (Lifecycle Event)
Jan 22 22:37:57 compute-0 nova_compute[182725]: 2026-01-22 22:37:57.309 182729 DEBUG nova.compute.manager [None req-aa0b795a-a197-47fd-92b3-9c6cdc1a6764 - - - - - -] [instance: 1297d9cc-74e6-4d32-b1c2-126898473271] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.072 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.072 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.096 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.106 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.106 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.126 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.206 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.206 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.213 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.214 182729 INFO nova.compute.claims [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.235 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.470 182729 DEBUG nova.compute.provider_tree [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.483 182729 DEBUG nova.scheduler.client.report [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.503 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.504 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.505 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.512 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.512 182729 INFO nova.compute.claims [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.588 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.588 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.608 182729 INFO nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.625 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.689 182729 DEBUG nova.compute.provider_tree [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.717 182729 DEBUG nova.scheduler.client.report [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.789 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.790 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.813 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.815 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.815 182729 INFO nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Creating image(s)
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.816 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.816 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.817 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.835 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.880 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.881 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.897 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.898 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.899 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.915 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.938 182729 DEBUG nova.policy [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.942 182729 INFO nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.964 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.983 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:58 compute-0 nova_compute[182725]: 2026-01-22 22:37:58.984 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.040 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.041 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.041 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.075 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.077 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.078 182729 INFO nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Creating image(s)
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.078 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.079 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.079 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.098 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.118 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.119 182729 DEBUG nova.virt.disk.api [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Checking if we can resize image /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.119 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 podman[228799]: 2026-01-22 22:37:59.136576836 +0000 UTC m=+0.058928192 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.138 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.163 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 podman[228797]: 2026-01-22 22:37:59.164622876 +0000 UTC m=+0.089926626 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.164 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.165 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.182 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.202 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.203 182729 DEBUG nova.virt.disk.api [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Cannot resize image /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.203 182729 DEBUG nova.objects.instance [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'migration_context' on Instance uuid 254e913f-3968-436b-afcc-e51c2350b232 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.216 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.216 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Ensure instance console log exists: /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.217 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.217 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.217 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.242 182729 DEBUG nova.policy [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.245 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.246 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.295 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.296 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.296 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.377 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.377 182729 DEBUG nova.virt.disk.api [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.378 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.432 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.433 182729 DEBUG nova.virt.disk.api [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.433 182729 DEBUG nova.objects.instance [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid f71aa702-00f6-400b-aa58-458e9e6d6b6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.449 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.449 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Ensure instance console log exists: /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.450 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.450 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.451 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.606 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Successfully created port: 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:37:59 compute-0 nova_compute[182725]: 2026-01-22 22:37:59.612 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.057 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Successfully created port: 4db284ba-8233-4db0-9bf5-367c86c67a4e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.946 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Successfully updated port: 4db284ba-8233-4db0-9bf5-367c86c67a4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.961 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.962 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.962 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.983 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Successfully updated port: 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.997 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.997 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:00 compute-0 nova_compute[182725]: 2026-01-22 22:38:00.997 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:38:01 compute-0 nova_compute[182725]: 2026-01-22 22:38:01.044 182729 DEBUG nova.compute.manager [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:01 compute-0 nova_compute[182725]: 2026-01-22 22:38:01.044 182729 DEBUG nova.compute.manager [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing instance network info cache due to event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:38:01 compute-0 nova_compute[182725]: 2026-01-22 22:38:01.045 182729 DEBUG oslo_concurrency.lockutils [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:01 compute-0 nova_compute[182725]: 2026-01-22 22:38:01.139 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:38:01 compute-0 nova_compute[182725]: 2026-01-22 22:38:01.176 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.122 182729 DEBUG nova.network.neutron [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updating instance_info_cache with network_info: [{"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.135 182729 DEBUG nova.network.neutron [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updating instance_info_cache with network_info: [{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.149 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.149 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Instance network_info: |[{"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.150 182729 DEBUG oslo_concurrency.lockutils [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.150 182729 DEBUG nova.network.neutron [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing network info cache for port 4db284ba-8233-4db0-9bf5-367c86c67a4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.153 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Start _get_guest_xml network_info=[{"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.155 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.155 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Instance network_info: |[{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.158 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Start _get_guest_xml network_info=[{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.162 182729 WARNING nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.163 182729 WARNING nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.167 182729 DEBUG nova.virt.libvirt.host [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.168 182729 DEBUG nova.virt.libvirt.host [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.170 182729 DEBUG nova.virt.libvirt.host [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.170 182729 DEBUG nova.virt.libvirt.host [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.171 182729 DEBUG nova.virt.libvirt.host [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.172 182729 DEBUG nova.virt.libvirt.host [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.173 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.173 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.173 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.173 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.174 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.174 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.174 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.174 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.174 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.175 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.175 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.175 182729 DEBUG nova.virt.hardware [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.178 182729 DEBUG nova.virt.libvirt.vif [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1598344943',display_name='tempest-TestNetworkBasicOps-server-1598344943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1598344943',id=124,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLY2vyl2wIGtoXWLaNeiGl2aNy8WzaO6IAdUibsEvR5qs8jDPbHCYPSLN+Zm5D1XcPbSC7cz/epYXZRIoDeWrQZEwZQpbXmA2aITKetpp9p9rogfzv5DRlF5GF9fOv5A0Q==',key_name='tempest-TestNetworkBasicOps-1464806580',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-a2tmt2r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:58Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=f71aa702-00f6-400b-aa58-458e9e6d6b6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.179 182729 DEBUG nova.network.os_vif_util [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.180 182729 DEBUG nova.network.os_vif_util [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.181 182729 DEBUG nova.objects.instance [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f71aa702-00f6-400b-aa58-458e9e6d6b6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.184 182729 DEBUG nova.virt.libvirt.host [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.185 182729 DEBUG nova.virt.libvirt.host [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.185 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.186 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.186 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.186 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.186 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.187 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.188 182729 DEBUG nova.virt.hardware [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.190 182729 DEBUG nova.virt.libvirt.vif [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-424272542',display_name='tempest-ServerActionsTestOtherB-server-424272542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-424272542',id=123,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-5vfbzi0x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTes
tOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:58Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=254e913f-3968-436b-afcc-e51c2350b232,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.191 182729 DEBUG nova.network.os_vif_util [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.191 182729 DEBUG nova.network.os_vif_util [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.192 182729 DEBUG nova.objects.instance [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'pci_devices' on Instance uuid 254e913f-3968-436b-afcc-e51c2350b232 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.196 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <uuid>f71aa702-00f6-400b-aa58-458e9e6d6b6d</uuid>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <name>instance-0000007c</name>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-1598344943</nova:name>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:38:02</nova:creationTime>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:port uuid="4db284ba-8233-4db0-9bf5-367c86c67a4e">
Jan 22 22:38:02 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <system>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="serial">f71aa702-00f6-400b-aa58-458e9e6d6b6d</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="uuid">f71aa702-00f6-400b-aa58-458e9e6d6b6d</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </system>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <os>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </os>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <features>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </features>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.config"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:34:9a:60"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="tap4db284ba-82"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/console.log" append="off"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <video>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </video>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:38:02 compute-0 nova_compute[182725]: </domain>
Jan 22 22:38:02 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.197 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Preparing to wait for external event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.197 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.198 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.198 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.198 182729 DEBUG nova.virt.libvirt.vif [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1598344943',display_name='tempest-TestNetworkBasicOps-server-1598344943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1598344943',id=124,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLY2vyl2wIGtoXWLaNeiGl2aNy8WzaO6IAdUibsEvR5qs8jDPbHCYPSLN+Zm5D1XcPbSC7cz/epYXZRIoDeWrQZEwZQpbXmA2aITKetpp9p9rogfzv5DRlF5GF9fOv5A0Q==',key_name='tempest-TestNetworkBasicOps-1464806580',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-a2tmt2r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:58Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=f71aa702-00f6-400b-aa58-458e9e6d6b6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.199 182729 DEBUG nova.network.os_vif_util [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.199 182729 DEBUG nova.network.os_vif_util [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.199 182729 DEBUG os_vif [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.201 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.201 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.204 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.204 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4db284ba-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.204 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4db284ba-82, col_values=(('external_ids', {'iface-id': '4db284ba-8233-4db0-9bf5-367c86c67a4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:9a:60', 'vm-uuid': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.205 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 NetworkManager[54954]: <info>  [1769121482.2066] manager: (tap4db284ba-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.208 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.211 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <uuid>254e913f-3968-436b-afcc-e51c2350b232</uuid>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <name>instance-0000007b</name>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestOtherB-server-424272542</nova:name>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:38:02</nova:creationTime>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:user uuid="8b15fdf3e23640a2b9579790941bb346">tempest-ServerActionsTestOtherB-1598778832-project-member</nova:user>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:project uuid="abdd987d004046138277253df8658aca">tempest-ServerActionsTestOtherB-1598778832</nova:project>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         <nova:port uuid="354f33c9-4c2e-4d16-bcc6-072c571ea8a3">
Jan 22 22:38:02 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <system>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="serial">254e913f-3968-436b-afcc-e51c2350b232</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="uuid">254e913f-3968-436b-afcc-e51c2350b232</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </system>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <os>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </os>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <features>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </features>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.config"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:7d:e4:29"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <target dev="tap354f33c9-4c"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/console.log" append="off"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <video>
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </video>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:38:02 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:38:02 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:38:02 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:38:02 compute-0 nova_compute[182725]: </domain>
Jan 22 22:38:02 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.213 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Preparing to wait for external event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.213 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.214 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.214 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.216 182729 DEBUG nova.virt.libvirt.vif [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-424272542',display_name='tempest-ServerActionsTestOtherB-server-424272542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-424272542',id=123,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-5vfbzi0x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-Server
ActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:58Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=254e913f-3968-436b-afcc-e51c2350b232,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.216 182729 DEBUG nova.network.os_vif_util [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.217 182729 DEBUG nova.network.os_vif_util [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.218 182729 DEBUG os_vif [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.219 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.220 182729 INFO os_vif [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82')
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.222 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.222 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.222 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.225 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.225 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354f33c9-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.226 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap354f33c9-4c, col_values=(('external_ids', {'iface-id': '354f33c9-4c2e-4d16-bcc6-072c571ea8a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:e4:29', 'vm-uuid': '254e913f-3968-436b-afcc-e51c2350b232'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.227 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 NetworkManager[54954]: <info>  [1769121482.2279] manager: (tap354f33c9-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.228 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.234 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.235 182729 INFO os_vif [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c')
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.301 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.301 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.302 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No VIF found with MAC fa:16:3e:7d:e4:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.302 182729 INFO nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Using config drive
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.304 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.304 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.305 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:34:9a:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.305 182729 INFO nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Using config drive
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.478 182729 DEBUG nova.compute.manager [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-changed-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.479 182729 DEBUG nova.compute.manager [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Refreshing instance network info cache due to event network-changed-354f33c9-4c2e-4d16-bcc6-072c571ea8a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.479 182729 DEBUG oslo_concurrency.lockutils [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.479 182729 DEBUG oslo_concurrency.lockutils [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.479 182729 DEBUG nova.network.neutron [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Refreshing network info cache for port 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.993 182729 INFO nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Creating config drive at /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.config
Jan 22 22:38:02 compute-0 nova_compute[182725]: 2026-01-22 22:38:02.999 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo82kkvzl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.038 182729 INFO nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Creating config drive at /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.config
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.046 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmzp2yod7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.152 182729 DEBUG oslo_concurrency.processutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo82kkvzl" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.180 182729 DEBUG oslo_concurrency.processutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmzp2yod7" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:03 compute-0 kernel: tap4db284ba-82: entered promiscuous mode
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2186] manager: (tap4db284ba-82): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00469|binding|INFO|Claiming lport 4db284ba-8233-4db0-9bf5-367c86c67a4e for this chassis.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.221 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00470|binding|INFO|4db284ba-8233-4db0-9bf5-367c86c67a4e: Claiming fa:16:3e:34:9a:60 10.100.0.6
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.237 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:9a:60 10.100.0.6'], port_security=['fa:16:3e:34:9a:60 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-faab65d1-5c71-4952-afd3-9e2ee1603831', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3206d97c-4936-4b4c-81d9-27b6ef63ed2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fce7f5f4-510e-4886-b82a-0e9c0af2a5e1, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4db284ba-8233-4db0-9bf5-367c86c67a4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.238 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4db284ba-8233-4db0-9bf5-367c86c67a4e in datapath faab65d1-5c71-4952-afd3-9e2ee1603831 bound to our chassis
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.240 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network faab65d1-5c71-4952-afd3-9e2ee1603831
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2495] manager: (tap354f33c9-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Jan 22 22:38:03 compute-0 systemd-udevd[228902]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.251 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[351483ed-30e5-460d-a65d-5f2e4aa91c8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.252 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfaab65d1-51 in ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:38:03 compute-0 systemd-udevd[228898]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.254 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfaab65d1-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.254 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a51dbff1-c5fd-4c7a-bd51-c08eabf2596e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.254 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3517c0-846b-4fb4-9fd0-70d4b9a82bd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2656] device (tap4db284ba-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2666] device (tap4db284ba-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.271 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[38545d63-92c0-4d0a-8b94-3a0b8b585c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 systemd-machined[154006]: New machine qemu-55-instance-0000007c.
Jan 22 22:38:03 compute-0 kernel: tap354f33c9-4c: entered promiscuous mode
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2751] device (tap354f33c9-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2758] device (tap354f33c9-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.276 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00471|binding|INFO|Claiming lport 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 for this chassis.
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00472|binding|INFO|354f33c9-4c2e-4d16-bcc6-072c571ea8a3: Claiming fa:16:3e:7d:e4:29 10.100.0.8
Jan 22 22:38:03 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-0000007c.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.287 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2880] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.2888] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.295 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:e4:29 10.100.0.8'], port_security=['fa:16:3e:7d:e4:29 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '254e913f-3968-436b-afcc-e51c2350b232', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2d993940-8666-43d7-8759-418fc1311e0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=354f33c9-4c2e-4d16-bcc6-072c571ea8a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.295 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[87a66ad1-e642-4400-b93b-f85a5135f998]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00473|binding|INFO|Setting lport 4db284ba-8233-4db0-9bf5-367c86c67a4e ovn-installed in OVS
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00474|binding|INFO|Setting lport 4db284ba-8233-4db0-9bf5-367c86c67a4e up in Southbound
Jan 22 22:38:03 compute-0 systemd-machined[154006]: New machine qemu-56-instance-0000007b.
Jan 22 22:38:03 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-0000007b.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.311 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.328 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e228dcef-be17-4562-beea-842bc0d55dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.3452] manager: (tapfaab65d1-50): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.344 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a40741-5145-48d8-8ef2-6635d0c1b439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.376 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4f81bb-e6b0-4327-90f1-41f0183000ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.379 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ac89f3ab-c72b-4d26-aa34-cebea3a677ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.3995] device (tapfaab65d1-50): carrier: link connected
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.406 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[00cfc6b1-5f66-4d73-bd0f-3679e60d576e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.426 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5ace26cc-43d7-47f3-9bc0-17667266f160]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfaab65d1-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:65:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512100, 'reachable_time': 36681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228945, 'error': None, 'target': 'ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00475|binding|INFO|Setting lport 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 ovn-installed in OVS
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00476|binding|INFO|Setting lport 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 up in Southbound
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.444 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5bac8f-7298-43ed-ad7f-102f6849fa24]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:65ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512100, 'tstamp': 512100}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228952, 'error': None, 'target': 'ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.459 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aed11935-b37d-47fd-a3b6-f1dcc1a6bee6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfaab65d1-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:65:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512100, 'reachable_time': 36681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228953, 'error': None, 'target': 'ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.490 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c3524571-332a-4bea-96b6-426a75f36835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.511 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121483.5114818, f71aa702-00f6-400b-aa58-458e9e6d6b6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.512 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] VM Started (Lifecycle Event)
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.535 182729 DEBUG nova.compute.manager [req-a53fad93-3778-42d5-94f6-a2ac82f17096 req-23a6b04c-11a8-494b-9484-3c7ec616e2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.536 182729 DEBUG oslo_concurrency.lockutils [req-a53fad93-3778-42d5-94f6-a2ac82f17096 req-23a6b04c-11a8-494b-9484-3c7ec616e2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.536 182729 DEBUG oslo_concurrency.lockutils [req-a53fad93-3778-42d5-94f6-a2ac82f17096 req-23a6b04c-11a8-494b-9484-3c7ec616e2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.536 182729 DEBUG oslo_concurrency.lockutils [req-a53fad93-3778-42d5-94f6-a2ac82f17096 req-23a6b04c-11a8-494b-9484-3c7ec616e2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.536 182729 DEBUG nova.compute.manager [req-a53fad93-3778-42d5-94f6-a2ac82f17096 req-23a6b04c-11a8-494b-9484-3c7ec616e2a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Processing event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.537 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.538 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.542 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121483.5116112, f71aa702-00f6-400b-aa58-458e9e6d6b6d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.542 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] VM Paused (Lifecycle Event)
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.548 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.551 182729 INFO nova.virt.libvirt.driver [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Instance spawned successfully.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.551 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.559 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[22999c40-4677-43e1-b4a6-5337851f994d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.560 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfaab65d1-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.560 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.561 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfaab65d1-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:03 compute-0 kernel: tapfaab65d1-50: entered promiscuous mode
Jan 22 22:38:03 compute-0 NetworkManager[54954]: <info>  [1769121483.5631] manager: (tapfaab65d1-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.566 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfaab65d1-50, col_values=(('external_ids', {'iface-id': 'cb8a3e75-5ba9-4a4a-9966-4ce2b8e6aa9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.567 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_controller[94850]: 2026-01-22T22:38:03Z|00477|binding|INFO|Releasing lport cb8a3e75-5ba9-4a4a-9966-4ce2b8e6aa9f from this chassis (sb_readonly=0)
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.570 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.596 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/faab65d1-5c71-4952-afd3-9e2ee1603831.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/faab65d1-5c71-4952-afd3-9e2ee1603831.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.596 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.597 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fc06e7-76b1-44b2-a531-6266e5bff072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.598 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-faab65d1-5c71-4952-afd3-9e2ee1603831
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/faab65d1-5c71-4952-afd3-9e2ee1603831.pid.haproxy
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID faab65d1-5c71-4952-afd3-9e2ee1603831
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:38:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:03.600 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831', 'env', 'PROCESS_TAG=haproxy-faab65d1-5c71-4952-afd3-9e2ee1603831', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/faab65d1-5c71-4952-afd3-9e2ee1603831.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.601 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.602 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.602 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.602 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.603 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.603 182729 DEBUG nova.virt.libvirt.driver [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.607 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121483.5469065, f71aa702-00f6-400b-aa58-458e9e6d6b6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.607 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] VM Resumed (Lifecycle Event)
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.657 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.661 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.705 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.787 182729 DEBUG nova.network.neutron [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updated VIF entry in instance network info cache for port 4db284ba-8233-4db0-9bf5-367c86c67a4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.788 182729 DEBUG nova.network.neutron [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updating instance_info_cache with network_info: [{"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.829 182729 INFO nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Took 4.75 seconds to spawn the instance on the hypervisor.
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.829 182729 DEBUG nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:03 compute-0 nova_compute[182725]: 2026-01-22 22:38:03.830 182729 DEBUG oslo_concurrency.lockutils [req-0ed2cec6-7b7a-4cbd-9473-a883e50ae593 req-f054939d-92b6-4b8c-b0be-36d6e7c054a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:03 compute-0 podman[228984]: 2026-01-22 22:38:03.980377634 +0000 UTC m=+0.047614389 container create 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:38:04 compute-0 systemd[1]: Started libpod-conmon-7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4.scope.
Jan 22 22:38:04 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:38:04 compute-0 podman[228984]: 2026-01-22 22:38:03.952898408 +0000 UTC m=+0.020135183 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:38:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ccec6a7d11187e337db9eaccc026dc396c3cdebbe2c0e64a24c74ed803d4472/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.052 182729 INFO nova.compute.manager [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Took 5.85 seconds to build instance.
Jan 22 22:38:04 compute-0 podman[228984]: 2026-01-22 22:38:04.061757445 +0000 UTC m=+0.128994220 container init 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:38:04 compute-0 podman[228984]: 2026-01-22 22:38:04.068833792 +0000 UTC m=+0.136070547 container start 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.070 182729 DEBUG oslo_concurrency.lockutils [None req-2c41577e-01e3-46d6-a735-8ad0974d0c1a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:04 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [NOTICE]   (229003) : New worker (229005) forked
Jan 22 22:38:04 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [NOTICE]   (229003) : Loading success.
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.131 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 unbound from our chassis
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.132 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.133 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.144 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[62219560-b9e7-4157-8485-d066f046ffe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.145 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap84d8b010-d1 in ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.147 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap84d8b010-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.147 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fa946a12-f176-40c9-9a1f-573ee2a5f716]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.148 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e97d7ffb-b04c-45e1-b3b7-40a8af82f4ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.159 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a94a106c-3df3-42c1-9301-c7fa71ce8e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.174 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[34d8550f-7029-4d75-9e3c-f1e83ec0b347]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.208 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d485098e-9f7c-4f2f-ba88-7f686afbb9af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.214 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f308fc-a44b-44b5-bcc3-cd417337df97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 NetworkManager[54954]: <info>  [1769121484.2154] manager: (tap84d8b010-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Jan 22 22:38:04 compute-0 systemd-udevd[228931]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.255 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[238f8001-0530-4f21-b3e7-feb65929c8d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.259 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aae998db-5228-40a7-b011-af899fd4fe82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.278 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121484.2787056, 254e913f-3968-436b-afcc-e51c2350b232 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.279 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] VM Started (Lifecycle Event)
Jan 22 22:38:04 compute-0 NetworkManager[54954]: <info>  [1769121484.2934] device (tap84d8b010-d0): carrier: link connected
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.300 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[57d37041-40f7-4ead-88a9-9020c7afdffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.311 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.316 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121484.2813525, 254e913f-3968-436b-afcc-e51c2350b232 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.316 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] VM Paused (Lifecycle Event)
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.325 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1015b9-1228-4d2b-8cfc-53d710b17c9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512189, 'reachable_time': 40213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229031, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.342 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.344 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.346 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c5978a2e-cb47-48cd-b5be-200f36226f90]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:3d39'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512189, 'tstamp': 512189}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229032, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.363 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3706b03-f0da-43c4-b771-58eb6996d860]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512189, 'reachable_time': 40213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229033, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.376 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.394 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eb495569-f083-4344-9e4d-a1b828da2d80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.454 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4d63b90c-6bf6-4547-ab85-a325961da2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.455 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.456 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.456 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84d8b010-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:04 compute-0 NetworkManager[54954]: <info>  [1769121484.4590] manager: (tap84d8b010-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.458 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 kernel: tap84d8b010-d0: entered promiscuous mode
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.460 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.461 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84d8b010-d0, col_values=(('external_ids', {'iface-id': '8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.462 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 ovn_controller[94850]: 2026-01-22T22:38:04Z|00478|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.469 182729 DEBUG nova.network.neutron [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updated VIF entry in instance network info cache for port 354f33c9-4c2e-4d16-bcc6-072c571ea8a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.470 182729 DEBUG nova.network.neutron [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updating instance_info_cache with network_info: [{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.477 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.478 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0351406d-fc6d-44e1-89dd-4ac9fd52ec3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.478 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:38:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:04.480 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'env', 'PROCESS_TAG=haproxy-84d8b010-d968-4df4-bedf-0c350ae42113', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/84d8b010-d968-4df4-bedf-0c350ae42113.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.492 182729 DEBUG oslo_concurrency.lockutils [req-c69d9683-d451-4753-997c-07061e79ef69 req-74bca3c9-b9df-48f4-a1c3-082aeb6e645e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.562 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121469.5619664, 49096ce9-494c-4c84-b263-86a05230d8af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.563 182729 INFO nova.compute.manager [-] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] VM Stopped (Lifecycle Event)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.584 182729 DEBUG nova.compute.manager [None req-a5c9f5c1-fb59-422c-ae47-cdf84995be7d - - - - - -] [instance: 49096ce9-494c-4c84-b263-86a05230d8af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.597 182729 DEBUG nova.compute.manager [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG nova.compute.manager [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Processing event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG nova.compute.manager [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.598 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.599 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.599 182729 DEBUG oslo_concurrency.lockutils [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.599 182729 DEBUG nova.compute.manager [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] No waiting events found dispatching network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.599 182729 WARNING nova.compute.manager [req-6b2d9c41-757d-4849-bf99-ce144766483a req-60611f46-3d30-4a5f-bb63-8554fdfe0e0a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received unexpected event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 for instance with vm_state building and task_state spawning.
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.599 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.603 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.606 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121484.6060624, 254e913f-3968-436b-afcc-e51c2350b232 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.606 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] VM Resumed (Lifecycle Event)
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.609 182729 INFO nova.virt.libvirt.driver [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Instance spawned successfully.
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.609 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.630 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.633 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.634 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.634 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.634 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.635 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.635 182729 DEBUG nova.virt.libvirt.driver [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.641 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.665 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.706 182729 INFO nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Took 5.89 seconds to spawn the instance on the hypervisor.
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.706 182729 DEBUG nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.798 182729 INFO nova.compute.manager [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Took 6.63 seconds to build instance.
Jan 22 22:38:04 compute-0 nova_compute[182725]: 2026-01-22 22:38:04.814 182729 DEBUG oslo_concurrency.lockutils [None req-16977dcb-b625-42c9-8446-28b8da43aacc 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:04 compute-0 podman[229066]: 2026-01-22 22:38:04.83316691 +0000 UTC m=+0.046288396 container create 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:38:04 compute-0 systemd[1]: Started libpod-conmon-159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c.scope.
Jan 22 22:38:04 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:38:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0607957d6c5a59b155906644247d694d50bf3738a08e6d7f4feea2dc2a5c10f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:38:04 compute-0 podman[229066]: 2026-01-22 22:38:04.809372996 +0000 UTC m=+0.022494512 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:38:04 compute-0 podman[229066]: 2026-01-22 22:38:04.912243474 +0000 UTC m=+0.125364980 container init 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:38:04 compute-0 podman[229066]: 2026-01-22 22:38:04.918461219 +0000 UTC m=+0.131582715 container start 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:38:04 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [NOTICE]   (229086) : New worker (229088) forked
Jan 22 22:38:04 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [NOTICE]   (229086) : Loading success.
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.636 182729 DEBUG nova.compute.manager [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.636 182729 DEBUG oslo_concurrency.lockutils [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.637 182729 DEBUG oslo_concurrency.lockutils [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.637 182729 DEBUG oslo_concurrency.lockutils [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.637 182729 DEBUG nova.compute.manager [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] No waiting events found dispatching network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:38:05 compute-0 nova_compute[182725]: 2026-01-22 22:38:05.637 182729 WARNING nova.compute.manager [req-a634afe5-6f51-425d-9cc0-ce65c5110897 req-cf988bcb-0935-4f36-9408-1707f9482e73 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received unexpected event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e for instance with vm_state active and task_state None.
Jan 22 22:38:06 compute-0 nova_compute[182725]: 2026-01-22 22:38:06.192 182729 INFO nova.compute.manager [None req-ddcf88e7-6d2e-495e-92cb-8bc4274797fb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Get console output
Jan 22 22:38:07 compute-0 sshd-session[229097]: Unable to negotiate with 45.79.67.127 port 49720: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 22:38:07 compute-0 nova_compute[182725]: 2026-01-22 22:38:07.228 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:07 compute-0 sshd-session[229101]: Unable to negotiate with 45.79.67.127 port 49682: no matching host key type found. Their offer: ssh-dss [preauth]
Jan 22 22:38:07 compute-0 sshd-session[229099]: Connection closed by 45.79.67.127 port 49688 [preauth]
Jan 22 22:38:08 compute-0 nova_compute[182725]: 2026-01-22 22:38:08.135 182729 DEBUG nova.compute.manager [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:38:08 compute-0 nova_compute[182725]: 2026-01-22 22:38:08.136 182729 DEBUG nova.compute.manager [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing instance network info cache due to event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:38:08 compute-0 nova_compute[182725]: 2026-01-22 22:38:08.136 182729 DEBUG oslo_concurrency.lockutils [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:08 compute-0 nova_compute[182725]: 2026-01-22 22:38:08.136 182729 DEBUG oslo_concurrency.lockutils [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:08 compute-0 nova_compute[182725]: 2026-01-22 22:38:08.136 182729 DEBUG nova.network.neutron [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing network info cache for port 4db284ba-8233-4db0-9bf5-367c86c67a4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:38:08 compute-0 sshd-session[229100]: Connection closed by 45.79.67.127 port 49692 [preauth]
Jan 22 22:38:08 compute-0 sshd-session[229098]: Connection closed by 45.79.67.127 port 49706 [preauth]
Jan 22 22:38:09 compute-0 sshd-session[229102]: Unable to negotiate with 45.79.67.127 port 49716: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.112 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'name': 'tempest-TestNetworkBasicOps-server-1598344943', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffd58948cb444c25ae034a02c0344de7', 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'hostId': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.116 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '254e913f-3968-436b-afcc-e51c2350b232', 'name': 'tempest-ServerActionsTestOtherB-server-424272542', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'abdd987d004046138277253df8658aca', 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'hostId': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:38:09 compute-0 podman[229111]: 2026-01-22 22:38:09.129953624 +0000 UTC m=+0.052364798 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:38:09 compute-0 podman[229110]: 2026-01-22 22:38:09.130651031 +0000 UTC m=+0.058650965 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:38:09 compute-0 nova_compute[182725]: 2026-01-22 22:38:09.136 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.150 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.150 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 podman[229109]: 2026-01-22 22:38:09.160870745 +0000 UTC m=+0.088874049 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.176 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.176 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e0cbf42-76d6-4c71-90c4-20189ab77eb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.117033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06c21bbc-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': 'd1b9c70234da0717c5aad1c527862dc8254b970a97ccbe61d637b1e7f10b282a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.117033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06c22896-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': 'ebe0cb813b72d48a5e99f28531c708e6758f3c6dbb314c87bcca985ef1b3d276'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.117033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06c617a8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '083429aee1e371ab8f6413c97f38ba979daa59dfaeb0403dde6312bd6b81bfbe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.117033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06c621a8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': 'e465b82399abc8f4e7b356edb22e4d0e83f4671addfe290c435d8bfd2ffdc992'}]}, 'timestamp': '2026-01-22 22:38:09.177141', '_unique_id': '7af36265965244188897962e1c8c9cef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.178 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.179 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.204 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.205 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f71aa702-00f6-400b-aa58-458e9e6d6b6d: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.229 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.229 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 254e913f-3968-436b-afcc-e51c2350b232: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.231 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f71aa702-00f6-400b-aa58-458e9e6d6b6d / tap4db284ba-82 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.231 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.233 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 254e913f-3968-436b-afcc-e51c2350b232 / tap354f33c9-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.234 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '672cd15b-40bc-4ec6-a314-5ac4f57d8cbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.229950', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06ce8730-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '0573a27835ed4519fa2c9efd58b128e542e9436d9ac670e68e31034ce27fcd90'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.229950', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06ced884-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '15a351120f28dde02705fccc6995249881cdf6388946ff46b1d55698c3b43c57'}]}, 'timestamp': '2026-01-22 22:38:09.234294', '_unique_id': '524458cfc6b84fbd876287d484d10b01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.235 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.236 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.236 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1046fe26-06bf-4326-85dc-6adeea797993', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.236256', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06cf2ffa-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '73acd22e5e87117f09fcfa2f91a450697cd431b714a62222533e7d25af64b13e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.236256', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06cf408a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '715e5b1c1372d853bb4b909bc85f6211bae85107bdb626d557d6dbe000fbc096'}]}, 'timestamp': '2026-01-22 22:38:09.236906', '_unique_id': 'f76ee71ff5484d95a5c6da05a943070f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.237 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>]
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.238 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a43413c9-0a31-4317-8b1b-4010df2e5072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.238481', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06cf869e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': 'ab5dfa4d3c0a825d8078d99195d652ae7486dfc989e1470731fc89dbacbcf200'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.238481', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06cf8f68-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '9c2b9ca2517438ad20961e9cf2e88669c9a7b4fa40a000b11b2ea6aefca3df14'}]}, 'timestamp': '2026-01-22 22:38:09.238921', '_unique_id': 'ab0133fe5f7c46c4b2bccca9bfb8385e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79573268-9f24-4770-9aac-169b966aca00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.240021', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06cfc302-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '6bd6de9072c7a79f7a17c9499e8436883cc5f77f53981e86cdb3420964e5f9f8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.240021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06cfcb4a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '1d0fd2662005b33d6de4618328306316f2659def4decf520bdd520de2fdd7b7e'}]}, 'timestamp': '2026-01-22 22:38:09.240455', '_unique_id': '21575fd3008840b1b6706762b55056d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.240 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.241 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.241 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/cpu volume: 5390000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.241 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/cpu volume: 4380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '806fad44-7f1b-4459-bc70-09cf8efd4e80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5390000000, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'timestamp': '2026-01-22T22:38:09.241526', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '06cffd9a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.862141834, 'message_signature': '388c214cb266fd9952725036b9a3c30a915d80cb8fd30458c672c7f28d922df5'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4380000000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 
'254e913f-3968-436b-afcc-e51c2350b232', 'timestamp': '2026-01-22T22:38:09.241526', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '06d0061e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.886725738, 'message_signature': 'f165b7bc5bb823704b1fc8914394e0e78775c4e6490601345163de21c0f7446a'}]}, 'timestamp': '2026-01-22 22:38:09.241952', '_unique_id': '20c3b23fb43e458d84af40b075cbf503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.243 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.243 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>]
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.258 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.258 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.267 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.267 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68320bdc-5f9b-4b0c-b2ba-a0c0eba385cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.243267', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d29ce4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': '3cd234483f9406fa52eda6388371ca8ad63ef86eca2768c51600f12a520c5264'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 
'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.243267', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d2a50e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': 'c1ed414530efa51c0c3de67cb4c6e82d3a8c825783f31b925399a20f4439c368'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.243267', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d3e22a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': 'f733378f3d70499b58008eac77cd746d9285af895488a31e2615eb3085299880'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.243267', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d3ea0e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': '35f431fe72086cc3eacb4d78db91c122acd1b45d25c8dc0dcc51c76d97c12eec'}]}, 'timestamp': '2026-01-22 22:38:09.267450', '_unique_id': '5d6dc92634e2417d8780877a7d28520c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.268 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efd234e1-2612-4a22-928a-73f3ad65429b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.268854', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d42960-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': 'e307984d13236a54761725b091a671be6ceaad4c9430d6d962e789b1b02f5296'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.268854', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d4319e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': 'aab9aab835ee495c360f0ee4296050bf3d46a2d3313473a6c28d1078c972e321'}]}, 'timestamp': '2026-01-22 22:38:09.269288', '_unique_id': '0d1c7cf1ce374eb184afcc269f12ca54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.269 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.270 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.270 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.270 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.270 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af6382ad-e55d-4550-8a5f-d5703a019826', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.270354', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d4639e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': 'eea56bf9642113e4ae844abdc9d97ef3da78a790549cb64466c017289e579429'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 
'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.270354', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d46b1e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': '38942c6880523f7aa2698e5133dfdbb14139cf9b483da3c41ca9cfe29433c19f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.270354', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d4732a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': 'eab44fb517bfca771be234aae58ef9aa8e590c1bf7befc9453ead0df02d6500a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.270354', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d47a82-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': 'df4a69a3900ebf741895d10194f89fc3e2a35ad2a4e6779bccf977187913a435'}]}, 'timestamp': '2026-01-22 22:38:09.271143', '_unique_id': '75413e9d7f154b6bb5a3ff7b76707e86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.271 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.272 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.latency volume: 132168390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.272 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.latency volume: 354319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.272 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.latency volume: 97235927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.272 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.latency volume: 392959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a861429e-bb91-4f18-bbd4-bd4dc42ac802', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 132168390, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.272266', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d4ae58-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '64846fed49bd791f778add8f3a78376dde22808d66fec687fcac9f063c551ef0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 354319, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.272266', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d4b5b0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '79b27bb2cbe41c2f02ecf6e14ad47f3fb9e57eba115551df448b80d844953aa0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 97235927, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.272266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d4bd12-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '16e678690d2c78ae41b192cf85d58e6786c0770c0612311bc44fdb3aa6eff6c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 392959, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.272266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d4c50a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': 'd73a8171b1b3d1ae355158d8f0bb9d025d907ac6546e16fb44aa1dca24b20837'}]}, 'timestamp': '2026-01-22 22:38:09.273062', '_unique_id': '105a0021995e48ec869129accd8b9b78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.273 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.274 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.274 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.274 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e507b35-eceb-4629-a714-4fcecb9f4bad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.274161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d4f868-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '78a06451226a57e6cba0c001ad5c08be4c0c98d5610141c9324603311ce348f6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.274161', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d50056-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '00000563ac09a3d9b6244151c4a6f0d481235616a8c94512fe57b0b95e591473'}]}, 'timestamp': '2026-01-22 22:38:09.274577', '_unique_id': '76f23d2c4b58486c8880ef9a1815b25b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.275 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7e37729-170d-4c57-863e-ca052ed0f8ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.275626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d531a2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '05f34178c674e2a9611bddb2bc1eb59e77019d119115eea906c2f45d76eb9be2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.275626', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d53b16-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': 'bc3cdcf81cc11c6afc804ccbe9e7cf734783de9eb8184abdaff2960161466f82'}]}, 'timestamp': '2026-01-22 22:38:09.276083', '_unique_id': '325b7908504742b4872ba1f041c727f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.276 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>]
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.277 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e92ed79a-ab30-4b60-b0ba-796e317c318b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.277424', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d577c0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '08c38208258877a0bba1aa4d2dbb7a8bbeaccdae2fc96bfe22270a69c6c58cf2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.277424', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d57f7c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '69d5d5c7e549102dbd567d5bd543c1d520c6c1cb742e74b639cb7232df9974b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.277424', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d58792-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '7178995d8c78714d82466575f5737aa507f8de370283ec48a7042764f35b98ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.277424', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d58ecc-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': 'ac54b5b496a7d143d804ade2366d3d57da03e9dedc6a767ae9d3f52a99f5c942'}]}, 'timestamp': '2026-01-22 22:38:09.278216', '_unique_id': '6ac5b86d145141d298c5abaf5e450780'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.278 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1598344943>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-424272542>]
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.279 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3841d33-c50d-4e89-a59b-5677ff42ba3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.279572', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d5cb80-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': 'a5b9fe50fc4eeb9ac0d3c29ee62540835313297bda0614b0c4479335a060e69c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 
'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.279572', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d5d3fa-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.900522302, 'message_signature': 'bc9f555735964eeacb1405813b02c9b2f72ad94ab90904db715773212e58de54'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.279572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d5dc2e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': '4f00bcd5f094ff0fad4faa00bf73bf9d0001892201a24cea86ec70cfe66ef5f6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.279572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d5e372-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.916363087, 'message_signature': 'a1cfef231dceb2d74ba992a016759766a3cc6b9597c313752c12ebf53c6d3d89'}]}, 'timestamp': '2026-01-22 22:38:09.280383', '_unique_id': '7b4a87b894794405b23fedcadd310926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.280 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.281 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.281 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.281 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92d6e9d4-1347-4023-a35f-936288079689', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.281471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d615e0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': 'dd5d2da75d6dbebe615906066c683204abb94dad731670aeb6a72177cadaeae3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.281471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d62148-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '7a8c3b6a01cbb59ec7f3ab562a9de4caaeb212df3532f7c2026c06616a3dca33'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.281471', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d628be-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '88e2a32a83aa608ad380efb044343597d352aee33d4eff393f2c4518262506b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.281471', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d62ff8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '16aedd56ebff60f68c28a08d5b99af1eb2ff9ff7458b6a865e63dabda3e39a30'}]}, 'timestamp': '2026-01-22 22:38:09.282343', '_unique_id': 'e0ba1478422941d78e4729e9af0dd754'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.282 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.283 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.283 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.283 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.283 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a415f816-1fa2-471f-a0d5-01871234dfce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.283419', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d661ee-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': 'de847eeb0396f2ef420d88e125d3a884f4ee7011bc25093ae145b13942fff7cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 
'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.283419', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d66964-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': 'c0a080436a9628498611e4ff4d7a785607a3c264ee904d47f4df43454ba5a970'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.283419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d6729c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '0a6824f335503d6a2141bff513d0bf0708df790d901c83fef2854789d4604c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.283419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d67a08-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '9de65cacfc49fd67a84020113715f7d409acfea1e2c1b097cfe4754bd343cb44'}]}, 'timestamp': '2026-01-22 22:38:09.284238', '_unique_id': 'fa3a25bae3ec418b97a3a84331bac5bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.284 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.285 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.285 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.285 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.285 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82b68935-3b16-49ec-8b4a-a2c62aa837e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-vda', 'timestamp': '2026-01-22T22:38:09.285339', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d6acee-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '9b9b73c7ef6fedbff16f7b15762996b4a6221edd1de55a45ce87b26c7ff4af8c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d-sda', 'timestamp': '2026-01-22T22:38:09.285339', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'instance-0000007c', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d6b5e0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.774322102, 'message_signature': '6db151e9193e350f5dec2d303802f0e63d0d99c4f22cf162a20dbb578d4cabc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:38:09.285339', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06d6c026-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '5cafd99fcc2e7d128c7462216213fb5756d080593153d317afd741d488e9e8cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:38:09.285339', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06d6c9ea-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.808371692, 'message_signature': '9381bd9b1e850ca4447d2b80caf596bbd4d2d2c41499570fbfe04945f5ea1cfd'}]}, 'timestamp': '2026-01-22 22:38:09.286317', '_unique_id': '40e101d3636a4b3b9030df3f07c8ce94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.286 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.287 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.287 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '135e13c5-94ac-4067-bc83-b87582fbff87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.287552', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d7037e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '9091587b68cca396d70763d67d6fb7526808073214a8b9266d9febf656146b30'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.287552', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d70c16-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': 'b8376c960f58ab0f3e1923ae390e33043907bf32c5b28624a7f0891fc8f6732f'}]}, 'timestamp': '2026-01-22 22:38:09.288002', '_unique_id': '6adb8f54009a4b5cabc36ab99e661777'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a739fb08-3e7d-4500-95e1-5193e905f994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.289061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d740a0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': 'dcc6458b604452a1b60534c0e78aaf49233e622a0994bdceaa36e483a46bafaa'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.289061', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d748d4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '431b2e3ca2d0c612ed83f17cedfaacd8dcc9e9f3c73aa21c3beefd0b5959269f'}]}, 'timestamp': '2026-01-22 22:38:09.289542', '_unique_id': '32c78f216bb2483686f3016f037e7266'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.289 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.290 12 DEBUG ceilometer.compute.pollsters [-] f71aa702-00f6-400b-aa58-458e9e6d6b6d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.290 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0d6134d-1d97-4c34-a5ed-c440a8cc569d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000007c-f71aa702-00f6-400b-aa58-458e9e6d6b6d-tap4db284ba-82', 'timestamp': '2026-01-22T22:38:09.290626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1598344943', 'name': 'tap4db284ba-82', 'instance_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:9a:60', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4db284ba-82'}, 'message_id': '06d77bc4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.88721027, 'message_signature': '9b8f7c4ca254790122dcce28fd75fee23aac9ffa6e119e14740d11920096d9ad'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:38:09.290626', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '06d78542-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5126.889452526, 'message_signature': '1ccf1bc0f98234492cff95d107a716a748a72d46e42333091ea831aef9fe2a98'}]}, 'timestamp': '2026-01-22 22:38:09.291096', '_unique_id': '0b17a9aeaa624ce0ad4c229570f16544'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:38:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:38:09.291 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:38:09 compute-0 nova_compute[182725]: 2026-01-22 22:38:09.470 182729 DEBUG nova.network.neutron [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updated VIF entry in instance network info cache for port 4db284ba-8233-4db0-9bf5-367c86c67a4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:38:09 compute-0 nova_compute[182725]: 2026-01-22 22:38:09.471 182729 DEBUG nova.network.neutron [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updating instance_info_cache with network_info: [{"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:09 compute-0 nova_compute[182725]: 2026-01-22 22:38:09.495 182729 DEBUG oslo_concurrency.lockutils [req-9a691006-d583-41c8-bc36-1151cca51c2c req-5a87ee8b-3234-420f-815c-c19457af258e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:12 compute-0 nova_compute[182725]: 2026-01-22 22:38:12.230 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:12.446 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:12.447 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:12.448 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:14 compute-0 nova_compute[182725]: 2026-01-22 22:38:14.138 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:15 compute-0 ovn_controller[94850]: 2026-01-22T22:38:15Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:9a:60 10.100.0.6
Jan 22 22:38:15 compute-0 ovn_controller[94850]: 2026-01-22T22:38:15Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:9a:60 10.100.0.6
Jan 22 22:38:16 compute-0 ovn_controller[94850]: 2026-01-22T22:38:16Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:e4:29 10.100.0.8
Jan 22 22:38:16 compute-0 ovn_controller[94850]: 2026-01-22T22:38:16Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:e4:29 10.100.0.8
Jan 22 22:38:17 compute-0 nova_compute[182725]: 2026-01-22 22:38:17.233 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:17 compute-0 nova_compute[182725]: 2026-01-22 22:38:17.921 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:19 compute-0 nova_compute[182725]: 2026-01-22 22:38:19.140 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:19 compute-0 sshd-session[229211]: Connection closed by 45.79.67.44 port 34482 [preauth]
Jan 22 22:38:19 compute-0 sshd-session[229212]: Unable to negotiate with 45.79.67.44 port 34458: no matching host key type found. Their offer: ssh-dss [preauth]
Jan 22 22:38:20 compute-0 sshd-session[229214]: Connection closed by 45.79.67.44 port 34466 [preauth]
Jan 22 22:38:20 compute-0 sshd-session[229215]: Connection closed by 45.79.67.44 port 34472 [preauth]
Jan 22 22:38:20 compute-0 sshd-session[229216]: Unable to negotiate with 45.79.67.44 port 34492: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 22 22:38:20 compute-0 sshd-session[229213]: Unable to negotiate with 45.79.67.44 port 34508: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.236 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.916 182729 INFO nova.compute.manager [None req-5beaaf87-c644-41c5-aa7a-444ce8456e53 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Get console output
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.922 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.933 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.934 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.934 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:22 compute-0 nova_compute[182725]: 2026-01-22 22:38:22.935 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.062 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.153 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.154 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.231 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.238 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.301 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.302 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.386 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.600 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.601 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5315MB free_disk=73.27564239501953GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.601 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.601 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.764 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 254e913f-3968-436b-afcc-e51c2350b232 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.765 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance f71aa702-00f6-400b-aa58-458e9e6d6b6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.765 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.765 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.880 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.899 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.974 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:38:23 compute-0 nova_compute[182725]: 2026-01-22 22:38:23.975 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:38:24 compute-0 nova_compute[182725]: 2026-01-22 22:38:24.143 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:24 compute-0 nova_compute[182725]: 2026-01-22 22:38:24.974 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:24 compute-0 nova_compute[182725]: 2026-01-22 22:38:24.975 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:38:24 compute-0 nova_compute[182725]: 2026-01-22 22:38:24.975 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:38:25 compute-0 podman[229236]: 2026-01-22 22:38:25.166653251 +0000 UTC m=+0.095246918 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:38:27 compute-0 nova_compute[182725]: 2026-01-22 22:38:27.240 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:27 compute-0 nova_compute[182725]: 2026-01-22 22:38:27.988 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:38:27 compute-0 nova_compute[182725]: 2026-01-22 22:38:27.989 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:38:27 compute-0 nova_compute[182725]: 2026-01-22 22:38:27.989 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:38:27 compute-0 nova_compute[182725]: 2026-01-22 22:38:27.990 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 254e913f-3968-436b-afcc-e51c2350b232 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:38:29 compute-0 nova_compute[182725]: 2026-01-22 22:38:29.146 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.076 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updating instance_info_cache with network_info: [{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.095 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.095 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.096 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.096 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.097 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.097 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:30 compute-0 nova_compute[182725]: 2026-01-22 22:38:30.098 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:38:30 compute-0 podman[229257]: 2026-01-22 22:38:30.134707021 +0000 UTC m=+0.062122762 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 22 22:38:30 compute-0 podman[229256]: 2026-01-22 22:38:30.161716825 +0000 UTC m=+0.092698595 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 22:38:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:31.485 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:38:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:31.486 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:38:31 compute-0 nova_compute[182725]: 2026-01-22 22:38:31.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:31 compute-0 nova_compute[182725]: 2026-01-22 22:38:31.893 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:32 compute-0 nova_compute[182725]: 2026-01-22 22:38:32.244 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:38:33.488 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:38:34 compute-0 nova_compute[182725]: 2026-01-22 22:38:34.148 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:34 compute-0 nova_compute[182725]: 2026-01-22 22:38:34.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:38:37 compute-0 nova_compute[182725]: 2026-01-22 22:38:37.247 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:39 compute-0 nova_compute[182725]: 2026-01-22 22:38:39.151 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:40 compute-0 podman[229302]: 2026-01-22 22:38:40.137068253 +0000 UTC m=+0.055615739 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:38:40 compute-0 podman[229303]: 2026-01-22 22:38:40.151808201 +0000 UTC m=+0.066051779 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:38:40 compute-0 podman[229304]: 2026-01-22 22:38:40.157502093 +0000 UTC m=+0.072090410 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:38:40 compute-0 ovn_controller[94850]: 2026-01-22T22:38:40Z|00479|binding|INFO|Releasing lport cb8a3e75-5ba9-4a4a-9966-4ce2b8e6aa9f from this chassis (sb_readonly=0)
Jan 22 22:38:40 compute-0 ovn_controller[94850]: 2026-01-22T22:38:40Z|00480|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:38:40 compute-0 nova_compute[182725]: 2026-01-22 22:38:40.512 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:42 compute-0 nova_compute[182725]: 2026-01-22 22:38:42.250 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:44 compute-0 nova_compute[182725]: 2026-01-22 22:38:44.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:47 compute-0 nova_compute[182725]: 2026-01-22 22:38:47.254 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:49 compute-0 nova_compute[182725]: 2026-01-22 22:38:49.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:52 compute-0 nova_compute[182725]: 2026-01-22 22:38:52.258 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:53 compute-0 nova_compute[182725]: 2026-01-22 22:38:53.111 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:54 compute-0 nova_compute[182725]: 2026-01-22 22:38:54.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:56 compute-0 podman[229366]: 2026-01-22 22:38:56.177908367 +0000 UTC m=+0.101610677 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:38:57 compute-0 nova_compute[182725]: 2026-01-22 22:38:57.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:57 compute-0 nova_compute[182725]: 2026-01-22 22:38:57.354 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:58 compute-0 nova_compute[182725]: 2026-01-22 22:38:58.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:38:59 compute-0 nova_compute[182725]: 2026-01-22 22:38:59.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:01 compute-0 podman[229388]: 2026-01-22 22:39:01.127054793 +0000 UTC m=+0.057525437 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 22:39:01 compute-0 podman[229387]: 2026-01-22 22:39:01.166013886 +0000 UTC m=+0.098230913 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.262 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.559 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.560 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.582 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.664 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.665 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.671 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.671 182729 INFO nova.compute.claims [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.710 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.711 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.777 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.938 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:02 compute-0 nova_compute[182725]: 2026-01-22 22:39:02.991 182729 DEBUG nova.compute.provider_tree [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.006 182729 DEBUG nova.scheduler.client.report [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.030 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.031 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.036 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.042 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.043 182729 INFO nova.compute.claims [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.124 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.125 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.162 182729 INFO nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.182 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.242 182729 DEBUG nova.compute.provider_tree [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.272 182729 DEBUG nova.scheduler.client.report [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.298 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.299 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.306 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.307 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.307 182729 INFO nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Creating image(s)
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.308 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.308 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.309 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.321 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.347 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.348 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.371 182729 INFO nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.376 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.377 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.377 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.389 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.407 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.445 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.445 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.464 182729 DEBUG nova.policy [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839eb51e89b14157b8da40ae1b480ef3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.476 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.477 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.477 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.526 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.528 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.529 182729 INFO nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Creating image(s)
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.529 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.530 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.530 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.541 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.542 182729 DEBUG nova.virt.disk.api [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.542 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.562 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.618 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.619 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.620 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.631 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.648 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.649 182729 DEBUG nova.virt.disk.api [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.649 182729 DEBUG nova.objects.instance [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.662 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.662 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Ensure instance console log exists: /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.663 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.663 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.663 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.684 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.684 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.715 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.716 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.716 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.735 182729 DEBUG nova.policy [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.772 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.773 182729 DEBUG nova.virt.disk.api [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Checking if we can resize image /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.773 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.831 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.832 182729 DEBUG nova.virt.disk.api [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Cannot resize image /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.832 182729 DEBUG nova.objects.instance [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'migration_context' on Instance uuid e4a5cd94-28e4-4031-ae49-2527cbacc939 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.846 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.846 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Ensure instance console log exists: /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.847 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.847 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:03 compute-0 nova_compute[182725]: 2026-01-22 22:39:03.847 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:04 compute-0 nova_compute[182725]: 2026-01-22 22:39:04.163 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:04 compute-0 nova_compute[182725]: 2026-01-22 22:39:04.698 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Successfully created port: 9a6dc28c-828d-435c-b619-7c51693137c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:39:04 compute-0 nova_compute[182725]: 2026-01-22 22:39:04.820 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Successfully created port: d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.052 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Successfully updated port: 9a6dc28c-828d-435c-b619-7c51693137c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.074 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.074 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.074 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.115 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Successfully updated port: d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.132 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.132 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.133 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.475 182729 DEBUG nova.compute.manager [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.476 182729 DEBUG nova.compute.manager [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing instance network info cache due to event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.476 182729 DEBUG oslo_concurrency.lockutils [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.564 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:39:06 compute-0 nova_compute[182725]: 2026-01-22 22:39:06.571 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:39:07 compute-0 nova_compute[182725]: 2026-01-22 22:39:07.265 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.075 182729 DEBUG nova.network.neutron [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updating instance_info_cache with network_info: [{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.101 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.102 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance network_info: |[{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.108 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Start _get_guest_xml network_info=[{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.114 182729 WARNING nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.119 182729 DEBUG nova.virt.libvirt.host [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.120 182729 DEBUG nova.virt.libvirt.host [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.123 182729 DEBUG nova.virt.libvirt.host [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.123 182729 DEBUG nova.virt.libvirt.host [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.125 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.125 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.126 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.126 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.126 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.126 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.127 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.127 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.127 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.127 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.128 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.128 182729 DEBUG nova.virt.hardware [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.132 182729 DEBUG nova.virt.libvirt.vif [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1471061737',display_name='tempest-ServerActionsTestOtherB-server-1471061737',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1471061737',id=129,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-m8y1mmcu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=e4a5cd94-28e4-4031-ae49-2527cbacc939,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.132 182729 DEBUG nova.network.os_vif_util [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.133 182729 DEBUG nova.network.os_vif_util [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.134 182729 DEBUG nova.objects.instance [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'pci_devices' on Instance uuid e4a5cd94-28e4-4031-ae49-2527cbacc939 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.149 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <uuid>e4a5cd94-28e4-4031-ae49-2527cbacc939</uuid>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <name>instance-00000081</name>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerActionsTestOtherB-server-1471061737</nova:name>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:39:08</nova:creationTime>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:user uuid="8b15fdf3e23640a2b9579790941bb346">tempest-ServerActionsTestOtherB-1598778832-project-member</nova:user>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:project uuid="abdd987d004046138277253df8658aca">tempest-ServerActionsTestOtherB-1598778832</nova:project>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:port uuid="d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb">
Jan 22 22:39:08 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <system>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="serial">e4a5cd94-28e4-4031-ae49-2527cbacc939</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="uuid">e4a5cd94-28e4-4031-ae49-2527cbacc939</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </system>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <os>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </os>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <features>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </features>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.config"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:19:50:f1"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="tapd1f07ed0-8f"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/console.log" append="off"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <video>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </video>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:39:08 compute-0 nova_compute[182725]: </domain>
Jan 22 22:39:08 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.150 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Preparing to wait for external event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.150 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.150 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.150 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.151 182729 DEBUG nova.virt.libvirt.vif [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1471061737',display_name='tempest-ServerActionsTestOtherB-server-1471061737',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1471061737',id=129,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-m8y1mmcu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=e4a5cd94-28e4-4031-ae49-2527cbacc939,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.151 182729 DEBUG nova.network.os_vif_util [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.152 182729 DEBUG nova.network.os_vif_util [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.152 182729 DEBUG os_vif [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.153 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.154 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.157 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f07ed0-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.158 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1f07ed0-8f, col_values=(('external_ids', {'iface-id': 'd1f07ed0-8f5e-407c-9e9b-74ba338aa8eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:50:f1', 'vm-uuid': 'e4a5cd94-28e4-4031-ae49-2527cbacc939'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.160 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 NetworkManager[54954]: <info>  [1769121548.1610] manager: (tapd1f07ed0-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.161 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.173 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.174 182729 INFO os_vif [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f')
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.223 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.224 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.224 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No VIF found with MAC fa:16:3e:19:50:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.225 182729 INFO nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Using config drive
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.234 182729 DEBUG nova.network.neutron [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.254 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.255 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance network_info: |[{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.255 182729 DEBUG oslo_concurrency.lockutils [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.255 182729 DEBUG nova.network.neutron [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.257 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Start _get_guest_xml network_info=[{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.261 182729 WARNING nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.266 182729 DEBUG nova.virt.libvirt.host [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.266 182729 DEBUG nova.virt.libvirt.host [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.271 182729 DEBUG nova.virt.libvirt.host [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.271 182729 DEBUG nova.virt.libvirt.host [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.272 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.272 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.272 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.272 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.272 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.273 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.274 182729 DEBUG nova.virt.hardware [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.276 182729 DEBUG nova.virt.libvirt.vif [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-920730235',display_name='tempest-TestNetworkAdvancedServerOps-server-920730235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-920730235',id=128,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKipAECYwv/kwfCJ76CvMOtJresIZaDLlyUkiFxcTlYjbAX4511FUkSKueMkA0cQfc3M7mxwBESGCBPari0ZsqICW2HR5vgwqLD0i+BZnu1BuFVZ3D4TfH+oikr45N3Ctg==',key_name='tempest-TestNetworkAdvancedServerOps-822496345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-dma0x6tw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:03Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.276 182729 DEBUG nova.network.os_vif_util [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.277 182729 DEBUG nova.network.os_vif_util [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.278 182729 DEBUG nova.objects.instance [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.290 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <uuid>dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f</uuid>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <name>instance-00000080</name>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-920730235</nova:name>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:39:08</nova:creationTime>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         <nova:port uuid="9a6dc28c-828d-435c-b619-7c51693137c4">
Jan 22 22:39:08 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <system>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="serial">dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="uuid">dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </system>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <os>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </os>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <features>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </features>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.config"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:2d:f6:e8"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <target dev="tap9a6dc28c-82"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/console.log" append="off"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <video>
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </video>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:39:08 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:39:08 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:39:08 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:39:08 compute-0 nova_compute[182725]: </domain>
Jan 22 22:39:08 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.291 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Preparing to wait for external event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.291 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.291 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.291 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.292 182729 DEBUG nova.virt.libvirt.vif [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-920730235',display_name='tempest-TestNetworkAdvancedServerOps-server-920730235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-920730235',id=128,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKipAECYwv/kwfCJ76CvMOtJresIZaDLlyUkiFxcTlYjbAX4511FUkSKueMkA0cQfc3M7mxwBESGCBPari0ZsqICW2HR5vgwqLD0i+BZnu1BuFVZ3D4TfH+oikr45N3Ctg==',key_name='tempest-TestNetworkAdvancedServerOps-822496345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-dma0x6tw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:03Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.292 182729 DEBUG nova.network.os_vif_util [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.292 182729 DEBUG nova.network.os_vif_util [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.293 182729 DEBUG os_vif [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.293 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.293 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.294 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.296 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.296 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a6dc28c-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.296 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a6dc28c-82, col_values=(('external_ids', {'iface-id': '9a6dc28c-828d-435c-b619-7c51693137c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:f6:e8', 'vm-uuid': 'dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.297 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 NetworkManager[54954]: <info>  [1769121548.2989] manager: (tap9a6dc28c-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.300 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.306 182729 INFO os_vif [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82')
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.350 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.350 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.350 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:2d:f6:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.351 182729 INFO nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Using config drive
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.585 182729 DEBUG nova.compute.manager [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.585 182729 DEBUG nova.compute.manager [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing instance network info cache due to event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.585 182729 DEBUG oslo_concurrency.lockutils [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.585 182729 DEBUG oslo_concurrency.lockutils [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.586 182729 DEBUG nova.network.neutron [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.814 182729 INFO nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Creating config drive at /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.config
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.821 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvtnj19nf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.922 182729 INFO nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Creating config drive at /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.config
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.927 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4np_2cvf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:08 compute-0 nova_compute[182725]: 2026-01-22 22:39:08.956 182729 DEBUG oslo_concurrency.processutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvtnj19nf" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:09 compute-0 kernel: tapd1f07ed0-8f: entered promiscuous mode
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.0157] manager: (tapd1f07ed0-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/229)
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00481|binding|INFO|Claiming lport d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb for this chassis.
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00482|binding|INFO|d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb: Claiming fa:16:3e:19:50:f1 10.100.0.10
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.018 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00483|binding|INFO|Setting lport d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb ovn-installed in OVS
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.036 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00484|binding|INFO|Setting lport d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb up in Southbound
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.041 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:50:f1 10.100.0.10'], port_security=['fa:16:3e:19:50:f1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b9a45c4-3bd4-4f5f-b26b-5b1ab95bdd58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.042 104215 INFO neutron.agent.ovn.metadata.agent [-] Port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 bound to our chassis
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.043 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.043 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 22:39:09 compute-0 systemd-udevd[229489]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.051 182729 DEBUG oslo_concurrency.processutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4np_2cvf" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.0593] device (tapd1f07ed0-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.058 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0d1e85-4592-4756-97df-4884b6c92eb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.0600] device (tapd1f07ed0-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:39:09 compute-0 systemd-machined[154006]: New machine qemu-57-instance-00000081.
Jan 22 22:39:09 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000081.
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.095 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6430edcb-3d63-465b-8acb-3bb1a76f0242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.097 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf33e78-08c7-4932-a9e9-9d6aa8225b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 kernel: tap9a6dc28c-82: entered promiscuous mode
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.1208] manager: (tap9a6dc28c-82): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00485|binding|INFO|Claiming lport 9a6dc28c-828d-435c-b619-7c51693137c4 for this chassis.
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00486|binding|INFO|9a6dc28c-828d-435c-b619-7c51693137c4: Claiming fa:16:3e:2d:f6:e8 10.100.0.14
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.122 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.131 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:f6:e8 10.100.0.14'], port_security=['fa:16:3e:2d:f6:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1f30876-5cd8-4e98-8834-4f70a79dfb84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec375847-f056-44d3-9d58-90e08d020586, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9a6dc28c-828d-435c-b619-7c51693137c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.1320] device (tap9a6dc28c-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.1328] device (tap9a6dc28c-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.132 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cd74b589-8f23-4623-8fed-dc4e647586e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00487|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 ovn-installed in OVS
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00488|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 up in Southbound
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.141 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.147 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.154 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d8929768-4e9d-4b4a-ac28-fa456f07f5d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512189, 'reachable_time': 16169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229516, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.165 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 systemd-machined[154006]: New machine qemu-58-instance-00000080.
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.169 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4183d8-96c1-4092-bb3f-542d436584c1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap84d8b010-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512202, 'tstamp': 512202}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229519, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap84d8b010-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512205, 'tstamp': 512205}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229519, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.171 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.172 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.173 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.174 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84d8b010-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.174 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.175 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84d8b010-d0, col_values=(('external_ids', {'iface-id': '8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.175 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.177 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9a6dc28c-828d-435c-b619-7c51693137c4 in datapath 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 unbound from our chassis
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.178 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:09 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-00000080.
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.188 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[92707937-158e-4457-b522-792df6c1709c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.189 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a53a2bd-51 in ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.191 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a53a2bd-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.191 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[354adcc2-8607-4e63-ae69-748ae5fd547e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.192 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[841eaa22-29bf-4fb2-a0ec-b5c73cf13ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.204 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e3454355-0d9e-43c4-a025-d697d2f27f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.221 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b5aaf7a5-918f-4a6c-a159-3de1910767cb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.253 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bee2835c-c59b-49d0-9e49-ac3f2c6369f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.2612] manager: (tap7a53a2bd-50): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.260 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d977608e-d146-4cc2-ac62-05e82a92844f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.308 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[69f92285-bc61-4357-8586-6962bb7530de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.311 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f5883b-1962-4b6c-a35f-f05db258876c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.3345] device (tap7a53a2bd-50): carrier: link connected
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.341 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[54a86a55-59aa-448d-bb41-2229fb55d7b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.349 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121549.349119, e4a5cd94-28e4-4031-ae49-2527cbacc939 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.350 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] VM Started (Lifecycle Event)
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.362 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[111a7aa9-233d-4825-ad4b-742294ecd966]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a53a2bd-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:68:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518693, 'reachable_time': 30997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229561, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.377 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b091d468-b422-48d4-bd62-60c824a96cbc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:68a6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518693, 'tstamp': 518693}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229562, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.394 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.398 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1b953e8d-7991-4925-b362-f090c7f55d78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a53a2bd-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:68:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518693, 'reachable_time': 30997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229563, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.402 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121549.3493438, e4a5cd94-28e4-4031-ae49-2527cbacc939 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.402 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] VM Paused (Lifecycle Event)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.425 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.428 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.437 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[09be2bc5-72d7-4e10-9c53-d716f27b6f0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.453 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.489 182729 DEBUG nova.compute.manager [req-9281f46f-f72e-490f-8e8e-80ebb488c7ad req-97c271a0-ceb2-4fd6-8ccb-c46455e0397c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.490 182729 DEBUG oslo_concurrency.lockutils [req-9281f46f-f72e-490f-8e8e-80ebb488c7ad req-97c271a0-ceb2-4fd6-8ccb-c46455e0397c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.490 182729 DEBUG oslo_concurrency.lockutils [req-9281f46f-f72e-490f-8e8e-80ebb488c7ad req-97c271a0-ceb2-4fd6-8ccb-c46455e0397c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.490 182729 DEBUG oslo_concurrency.lockutils [req-9281f46f-f72e-490f-8e8e-80ebb488c7ad req-97c271a0-ceb2-4fd6-8ccb-c46455e0397c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.491 182729 DEBUG nova.compute.manager [req-9281f46f-f72e-490f-8e8e-80ebb488c7ad req-97c271a0-ceb2-4fd6-8ccb-c46455e0397c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Processing event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.498 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[04b6413e-7c6c-4f9f-8125-46112a745eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.500 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a53a2bd-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.500 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.500 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a53a2bd-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 NetworkManager[54954]: <info>  [1769121549.5030] manager: (tap7a53a2bd-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 22 22:39:09 compute-0 kernel: tap7a53a2bd-50: entered promiscuous mode
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.505 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a53a2bd-50, col_values=(('external_ids', {'iface-id': 'b9ddfa27-d43e-4650-8846-a57a89114a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_controller[94850]: 2026-01-22T22:39:09Z|00489|binding|INFO|Releasing lport b9ddfa27-d43e-4650-8846-a57a89114a49 from this chassis (sb_readonly=0)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.518 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.520 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.521 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.522 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ce966f-6db2-4fe8-b10f-7a22569b104c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.522 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:39:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:09.523 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'env', 'PROCESS_TAG=haproxy-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.608 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121549.6076758, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.608 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Started (Lifecycle Event)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.610 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.614 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.618 182729 INFO nova.virt.libvirt.driver [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance spawned successfully.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.618 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.630 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.633 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.651 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.651 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.652 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.652 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.653 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.653 182729 DEBUG nova.virt.libvirt.driver [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.658 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.658 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121549.6077807, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.658 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Paused (Lifecycle Event)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.700 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.705 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121549.61336, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.706 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Resumed (Lifecycle Event)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.729 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.732 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.770 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.796 182729 INFO nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Took 6.49 seconds to spawn the instance on the hypervisor.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.797 182729 DEBUG nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.911 182729 INFO nova.compute.manager [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Took 7.28 seconds to build instance.
Jan 22 22:39:09 compute-0 podman[229601]: 2026-01-22 22:39:09.947066441 +0000 UTC m=+0.072899311 container create 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.953 182729 DEBUG oslo_concurrency.lockutils [None req-6bb24f9c-2948-440b-8258-ec9a1f2c5dee 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.969 182729 DEBUG nova.network.neutron [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updated VIF entry in instance network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.970 182729 DEBUG nova.network.neutron [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:09 compute-0 systemd[1]: Started libpod-conmon-43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0.scope.
Jan 22 22:39:09 compute-0 nova_compute[182725]: 2026-01-22 22:39:09.995 182729 DEBUG oslo_concurrency.lockutils [req-46fd6a50-4285-4007-9e14-480e782c03db req-e7e56480-cb00-45f0-8491-296250b73279 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:10 compute-0 podman[229601]: 2026-01-22 22:39:09.907052862 +0000 UTC m=+0.032885742 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:39:10 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:39:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a30a713fece18f4e45710de58c602e4dace33a2f92854cdbb64a5ce891d4d33a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:39:10 compute-0 podman[229601]: 2026-01-22 22:39:10.058548594 +0000 UTC m=+0.184381564 container init 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 22:39:10 compute-0 podman[229601]: 2026-01-22 22:39:10.070323708 +0000 UTC m=+0.196156608 container start 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:39:10 compute-0 nova_compute[182725]: 2026-01-22 22:39:10.107 182729 DEBUG nova.network.neutron [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updated VIF entry in instance network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:10 compute-0 nova_compute[182725]: 2026-01-22 22:39:10.108 182729 DEBUG nova.network.neutron [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updating instance_info_cache with network_info: [{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:10 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [NOTICE]   (229621) : New worker (229623) forked
Jan 22 22:39:10 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [NOTICE]   (229621) : Loading success.
Jan 22 22:39:10 compute-0 nova_compute[182725]: 2026-01-22 22:39:10.123 182729 DEBUG oslo_concurrency.lockutils [req-f785a50f-9473-4d92-b981-8ad46fcf24e0 req-705c2954-d0fb-489a-9e14-3c3b47e77178 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.133 182729 DEBUG nova.compute.manager [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.134 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.134 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.135 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.135 182729 DEBUG nova.compute.manager [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Processing event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.136 182729 DEBUG nova.compute.manager [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.136 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.137 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.137 182729 DEBUG oslo_concurrency.lockutils [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.137 182729 DEBUG nova.compute.manager [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] No waiting events found dispatching network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.138 182729 WARNING nova.compute.manager [req-55c1d480-63b8-4569-9b60-f8d7d48b395b req-90ae0d2b-8d70-49ce-9be4-b0e08aa49ae0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received unexpected event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb for instance with vm_state building and task_state spawning.
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.139 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.143 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121551.142897, e4a5cd94-28e4-4031-ae49-2527cbacc939 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.143 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] VM Resumed (Lifecycle Event)
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.145 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.149 182729 INFO nova.virt.libvirt.driver [-] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance spawned successfully.
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.150 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:39:11 compute-0 podman[229632]: 2026-01-22 22:39:11.152833469 +0000 UTC m=+0.069086716 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.164 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.171 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.176 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.176 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.177 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.177 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.178 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.179 182729 DEBUG nova.virt.libvirt.driver [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:11 compute-0 podman[229634]: 2026-01-22 22:39:11.180067979 +0000 UTC m=+0.089847684 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:39:11 compute-0 podman[229633]: 2026-01-22 22:39:11.180032068 +0000 UTC m=+0.096365467 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.206 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.245 182729 INFO nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Took 7.72 seconds to spawn the instance on the hypervisor.
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.246 182729 DEBUG nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.339 182729 INFO nova.compute.manager [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Took 8.45 seconds to build instance.
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.359 182729 DEBUG oslo_concurrency.lockutils [None req-3c13ec91-d0dd-4b93-bdad-d8ef0d32e1aa 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.375 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:11 compute-0 ovn_controller[94850]: 2026-01-22T22:39:11Z|00490|binding|INFO|Releasing lport cb8a3e75-5ba9-4a4a-9966-4ce2b8e6aa9f from this chassis (sb_readonly=0)
Jan 22 22:39:11 compute-0 ovn_controller[94850]: 2026-01-22T22:39:11Z|00491|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:39:11 compute-0 ovn_controller[94850]: 2026-01-22T22:39:11Z|00492|binding|INFO|Releasing lport b9ddfa27-d43e-4650-8846-a57a89114a49 from this chassis (sb_readonly=0)
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.493 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.617 182729 DEBUG nova.compute.manager [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.617 182729 DEBUG oslo_concurrency.lockutils [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.618 182729 DEBUG oslo_concurrency.lockutils [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.618 182729 DEBUG oslo_concurrency.lockutils [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.618 182729 DEBUG nova.compute.manager [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:11 compute-0 nova_compute[182725]: 2026-01-22 22:39:11.619 182729 WARNING nova.compute.manager [req-ce035e93-08f9-4a9b-8711-b45da4803722 req-d347443b-74c9-41c0-8fcb-0c81d954288c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state None.
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.424 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.424 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.424 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.424 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.425 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.434 182729 INFO nova.compute.manager [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Terminating instance
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.447 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.448 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.449 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.449 182729 DEBUG nova.compute.manager [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:39:12 compute-0 kernel: tap4db284ba-82 (unregistering): left promiscuous mode
Jan 22 22:39:12 compute-0 NetworkManager[54954]: <info>  [1769121552.4736] device (tap4db284ba-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:39:12 compute-0 ovn_controller[94850]: 2026-01-22T22:39:12Z|00493|binding|INFO|Releasing lport 4db284ba-8233-4db0-9bf5-367c86c67a4e from this chassis (sb_readonly=0)
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.488 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 ovn_controller[94850]: 2026-01-22T22:39:12Z|00494|binding|INFO|Setting lport 4db284ba-8233-4db0-9bf5-367c86c67a4e down in Southbound
Jan 22 22:39:12 compute-0 ovn_controller[94850]: 2026-01-22T22:39:12Z|00495|binding|INFO|Removing iface tap4db284ba-82 ovn-installed in OVS
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.491 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.497 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:9a:60 10.100.0.6'], port_security=['fa:16:3e:34:9a:60 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f71aa702-00f6-400b-aa58-458e9e6d6b6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-faab65d1-5c71-4952-afd3-9e2ee1603831', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3206d97c-4936-4b4c-81d9-27b6ef63ed2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fce7f5f4-510e-4886-b82a-0e9c0af2a5e1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=4db284ba-8233-4db0-9bf5-367c86c67a4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.498 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 4db284ba-8233-4db0-9bf5-367c86c67a4e in datapath faab65d1-5c71-4952-afd3-9e2ee1603831 unbound from our chassis
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.500 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network faab65d1-5c71-4952-afd3-9e2ee1603831, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.504 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5b525d-022a-4342-b6de-31add0e5db15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.505 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831 namespace which is not needed anymore
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.515 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Jan 22 22:39:12 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000007c.scope: Consumed 14.657s CPU time.
Jan 22 22:39:12 compute-0 systemd-machined[154006]: Machine qemu-55-instance-0000007c terminated.
Jan 22 22:39:12 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [NOTICE]   (229003) : haproxy version is 2.8.14-c23fe91
Jan 22 22:39:12 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [NOTICE]   (229003) : path to executable is /usr/sbin/haproxy
Jan 22 22:39:12 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [WARNING]  (229003) : Exiting Master process...
Jan 22 22:39:12 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [ALERT]    (229003) : Current worker (229005) exited with code 143 (Terminated)
Jan 22 22:39:12 compute-0 neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831[228999]: [WARNING]  (229003) : All workers exited. Exiting... (0)
Jan 22 22:39:12 compute-0 systemd[1]: libpod-7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4.scope: Deactivated successfully.
Jan 22 22:39:12 compute-0 conmon[228999]: conmon 7103e4150ac030cf2940 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4.scope/container/memory.events
Jan 22 22:39:12 compute-0 podman[229722]: 2026-01-22 22:39:12.658519962 +0000 UTC m=+0.054386139 container died 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:39:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4-userdata-shm.mount: Deactivated successfully.
Jan 22 22:39:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ccec6a7d11187e337db9eaccc026dc396c3cdebbe2c0e64a24c74ed803d4472-merged.mount: Deactivated successfully.
Jan 22 22:39:12 compute-0 podman[229722]: 2026-01-22 22:39:12.721951765 +0000 UTC m=+0.117817932 container cleanup 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.725 182729 INFO nova.virt.libvirt.driver [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Instance destroyed successfully.
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.726 182729 DEBUG nova.objects.instance [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid f71aa702-00f6-400b-aa58-458e9e6d6b6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:12 compute-0 systemd[1]: libpod-conmon-7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4.scope: Deactivated successfully.
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.756 182729 DEBUG nova.virt.libvirt.vif [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1598344943',display_name='tempest-TestNetworkBasicOps-server-1598344943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1598344943',id=124,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLY2vyl2wIGtoXWLaNeiGl2aNy8WzaO6IAdUibsEvR5qs8jDPbHCYPSLN+Zm5D1XcPbSC7cz/epYXZRIoDeWrQZEwZQpbXmA2aITKetpp9p9rogfzv5DRlF5GF9fOv5A0Q==',key_name='tempest-TestNetworkBasicOps-1464806580',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:38:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-a2tmt2r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:38:03Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=f71aa702-00f6-400b-aa58-458e9e6d6b6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.756 182729 DEBUG nova.network.os_vif_util [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "address": "fa:16:3e:34:9a:60", "network": {"id": "faab65d1-5c71-4952-afd3-9e2ee1603831", "bridge": "br-int", "label": "tempest-network-smoke--1808800600", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db284ba-82", "ovs_interfaceid": "4db284ba-8233-4db0-9bf5-367c86c67a4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.757 182729 DEBUG nova.network.os_vif_util [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.757 182729 DEBUG os_vif [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.761 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.762 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4db284ba-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.776 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.780 182729 INFO os_vif [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:9a:60,bridge_name='br-int',has_traffic_filtering=True,id=4db284ba-8233-4db0-9bf5-367c86c67a4e,network=Network(faab65d1-5c71-4952-afd3-9e2ee1603831),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db284ba-82')
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.783 182729 INFO nova.virt.libvirt.driver [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Deleting instance files /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d_del
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.785 182729 INFO nova.virt.libvirt.driver [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Deletion of /var/lib/nova/instances/f71aa702-00f6-400b-aa58-458e9e6d6b6d_del complete
Jan 22 22:39:12 compute-0 podman[229768]: 2026-01-22 22:39:12.793880661 +0000 UTC m=+0.046217915 container remove 7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.800 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5934fca-9ff0-4cdf-8591-8e6b615dfa13]: (4, ('Thu Jan 22 10:39:12 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831 (7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4)\n7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4\nThu Jan 22 10:39:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831 (7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4)\n7103e4150ac030cf2940824d2fe60a419aaf436ab0378cc681490e692af1eef4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.803 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8c90a2d3-9d4f-4512-bef3-35d4daa73f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.805 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfaab65d1-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:12 compute-0 kernel: tapfaab65d1-50: left promiscuous mode
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.814 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 nova_compute[182725]: 2026-01-22 22:39:12.818 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.822 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5015de7d-65b3-46b9-9d67-2b869721e893]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.842 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20640ba0-6a24-4ce0-b209-4b892437f405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.844 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[db060681-6676-4f99-a8c1-917340398c65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.868 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e54cbed2-a81a-4885-a46a-cc6f7fc8cb29]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512092, 'reachable_time': 30334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229784, 'error': None, 'target': 'ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dfaab65d1\x2d5c71\x2d4952\x2dafd3\x2d9e2ee1603831.mount: Deactivated successfully.
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.873 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-faab65d1-5c71-4952-afd3-9e2ee1603831 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:39:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:12.874 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a95054db-887f-4711-9867-2eb8514c696e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.273 182729 INFO nova.compute.manager [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Took 0.82 seconds to destroy the instance on the hypervisor.
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.274 182729 DEBUG oslo.service.loopingcall [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.274 182729 DEBUG nova.compute.manager [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.274 182729 DEBUG nova.network.neutron [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.747 182729 DEBUG nova.compute.manager [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.748 182729 DEBUG nova.compute.manager [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing instance network info cache due to event network-changed-4db284ba-8233-4db0-9bf5-367c86c67a4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.749 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.749 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.750 182729 DEBUG nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Refreshing network info cache for port 4db284ba-8233-4db0-9bf5-367c86c67a4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.964 182729 DEBUG nova.compute.manager [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.965 182729 DEBUG nova.compute.manager [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing instance network info cache due to event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.965 182729 DEBUG oslo_concurrency.lockutils [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.965 182729 DEBUG oslo_concurrency.lockutils [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:13 compute-0 nova_compute[182725]: 2026-01-22 22:39:13.966 182729 DEBUG nova.network.neutron [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.069 182729 DEBUG nova.network.neutron [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.101 182729 INFO nova.compute.manager [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Took 0.83 seconds to deallocate network for instance.
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.130 182729 INFO nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Port 4db284ba-8233-4db0-9bf5-367c86c67a4e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.131 182729 DEBUG nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.145 182729 DEBUG nova.compute.manager [req-627c6525-658d-4863-8f14-887c4c50ad5a req-37e68e86-c5ef-4ac5-9ea4-f8bb7ad10054 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-vif-deleted-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.159 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f71aa702-00f6-400b-aa58-458e9e6d6b6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.159 182729 DEBUG nova.compute.manager [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.159 182729 DEBUG nova.compute.manager [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing instance network info cache due to event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.160 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.160 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.160 182729 DEBUG nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.217 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.218 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.657 182729 DEBUG nova.compute.provider_tree [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.672 182729 DEBUG nova.scheduler.client.report [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.692 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.783 182729 INFO nova.scheduler.client.report [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance f71aa702-00f6-400b-aa58-458e9e6d6b6d
Jan 22 22:39:14 compute-0 nova_compute[182725]: 2026-01-22 22:39:14.875 182729 DEBUG oslo_concurrency.lockutils [None req-b10e0a9a-7c40-407b-b661-9a41e97d2726 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:15 compute-0 nova_compute[182725]: 2026-01-22 22:39:15.501 182729 DEBUG nova.network.neutron [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updated VIF entry in instance network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:15 compute-0 nova_compute[182725]: 2026-01-22 22:39:15.501 182729 DEBUG nova.network.neutron [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:15 compute-0 nova_compute[182725]: 2026-01-22 22:39:15.538 182729 DEBUG oslo_concurrency.lockutils [req-000399fc-bccd-4eb0-ba8c-2e83c16a1d82 req-49fdcedf-d1c4-4dcc-86de-b3e286734657 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.088 182729 DEBUG nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-vif-unplugged-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.089 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.089 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.089 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.089 182729 DEBUG nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] No waiting events found dispatching network-vif-unplugged-4db284ba-8233-4db0-9bf5-367c86c67a4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.090 182729 WARNING nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received unexpected event network-vif-unplugged-4db284ba-8233-4db0-9bf5-367c86c67a4e for instance with vm_state deleted and task_state None.
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.090 182729 DEBUG nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.090 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.090 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.091 182729 DEBUG oslo_concurrency.lockutils [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f71aa702-00f6-400b-aa58-458e9e6d6b6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.091 182729 DEBUG nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] No waiting events found dispatching network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.091 182729 WARNING nova.compute.manager [req-7d8faf84-13a3-4a70-bc4c-52c4eee51465 req-f7a1a403-ac7e-4c01-b261-12c61a10a95b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Received unexpected event network-vif-plugged-4db284ba-8233-4db0-9bf5-367c86c67a4e for instance with vm_state deleted and task_state None.
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.883 182729 DEBUG nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updated VIF entry in instance network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.883 182729 DEBUG nova.network.neutron [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updating instance_info_cache with network_info: [{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:16 compute-0 nova_compute[182725]: 2026-01-22 22:39:16.910 182729 DEBUG oslo_concurrency.lockutils [req-d65fefda-999c-4f4f-9be9-60a41c0b7a70 req-8b2caf81-75a9-4db9-bdbf-70735a835a69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:17 compute-0 nova_compute[182725]: 2026-01-22 22:39:17.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:17 compute-0 nova_compute[182725]: 2026-01-22 22:39:17.768 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:18 compute-0 ovn_controller[94850]: 2026-01-22T22:39:18Z|00496|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:39:18 compute-0 ovn_controller[94850]: 2026-01-22T22:39:18Z|00497|binding|INFO|Releasing lport b9ddfa27-d43e-4650-8846-a57a89114a49 from this chassis (sb_readonly=0)
Jan 22 22:39:18 compute-0 nova_compute[182725]: 2026-01-22 22:39:18.498 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:18 compute-0 nova_compute[182725]: 2026-01-22 22:39:18.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:18 compute-0 nova_compute[182725]: 2026-01-22 22:39:18.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:18 compute-0 nova_compute[182725]: 2026-01-22 22:39:18.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:39:19 compute-0 nova_compute[182725]: 2026-01-22 22:39:19.168 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:21 compute-0 nova_compute[182725]: 2026-01-22 22:39:21.223 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:21 compute-0 ovn_controller[94850]: 2026-01-22T22:39:21Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:f6:e8 10.100.0.14
Jan 22 22:39:21 compute-0 ovn_controller[94850]: 2026-01-22T22:39:21Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:f6:e8 10.100.0.14
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.910 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.939 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.940 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.940 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:22 compute-0 nova_compute[182725]: 2026-01-22 22:39:22.941 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.114 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.196 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.197 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.257 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.265 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.321 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.322 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.374 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.382 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.437 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.438 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.493 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.691 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.692 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5158MB free_disk=73.24839401245117GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.692 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.693 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:23 compute-0 ovn_controller[94850]: 2026-01-22T22:39:23Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:50:f1 10.100.0.10
Jan 22 22:39:23 compute-0 ovn_controller[94850]: 2026-01-22T22:39:23Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:50:f1 10.100.0.10
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.873 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 254e913f-3968-436b-afcc-e51c2350b232 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.873 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.874 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance e4a5cd94-28e4-4031-ae49-2527cbacc939 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.874 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:39:23 compute-0 nova_compute[182725]: 2026-01-22 22:39:23.875 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:39:24 compute-0 nova_compute[182725]: 2026-01-22 22:39:24.089 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:24 compute-0 nova_compute[182725]: 2026-01-22 22:39:24.113 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:24 compute-0 nova_compute[182725]: 2026-01-22 22:39:24.143 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:39:24 compute-0 nova_compute[182725]: 2026-01-22 22:39:24.144 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:24 compute-0 nova_compute[182725]: 2026-01-22 22:39:24.171 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.122 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.123 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.151 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:26 compute-0 nova_compute[182725]: 2026-01-22 22:39:26.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:27 compute-0 podman[229823]: 2026-01-22 22:39:27.146541114 +0000 UTC m=+0.063696941 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.574 182729 INFO nova.compute.manager [None req-0ab6f8de-8964-4ca5-b82a-dc0c5da84761 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Get console output
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.580 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.722 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121552.7213545, f71aa702-00f6-400b-aa58-458e9e6d6b6d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.723 182729 INFO nova.compute.manager [-] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] VM Stopped (Lifecycle Event)
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.775 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:27 compute-0 nova_compute[182725]: 2026-01-22 22:39:27.798 182729 DEBUG nova.compute.manager [None req-f0c9a65c-1021-4636-9404-656c3371e10b - - - - - -] [instance: f71aa702-00f6-400b-aa58-458e9e6d6b6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.161 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.161 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.162 182729 INFO nova.compute.manager [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Rebooting instance
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.203 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.203 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:28 compute-0 nova_compute[182725]: 2026-01-22 22:39:28.204 182729 DEBUG nova.network.neutron [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:39:29 compute-0 nova_compute[182725]: 2026-01-22 22:39:29.173 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:30 compute-0 nova_compute[182725]: 2026-01-22 22:39:30.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:30 compute-0 nova_compute[182725]: 2026-01-22 22:39:30.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:39:31 compute-0 nova_compute[182725]: 2026-01-22 22:39:31.925 182729 DEBUG nova.network.neutron [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:31 compute-0 nova_compute[182725]: 2026-01-22 22:39:31.949 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:31 compute-0 nova_compute[182725]: 2026-01-22 22:39:31.978 182729 DEBUG nova.compute.manager [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:32 compute-0 podman[229845]: 2026-01-22 22:39:32.137963618 +0000 UTC m=+0.063034955 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Jan 22 22:39:32 compute-0 podman[229844]: 2026-01-22 22:39:32.204441347 +0000 UTC m=+0.125386271 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 22:39:32 compute-0 nova_compute[182725]: 2026-01-22 22:39:32.207 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:32 compute-0 nova_compute[182725]: 2026-01-22 22:39:32.255 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:32.255 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:32.257 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:39:32 compute-0 nova_compute[182725]: 2026-01-22 22:39:32.778 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:32 compute-0 nova_compute[182725]: 2026-01-22 22:39:32.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:33.260 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:33 compute-0 nova_compute[182725]: 2026-01-22 22:39:33.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.175 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 kernel: tap9a6dc28c-82 (unregistering): left promiscuous mode
Jan 22 22:39:34 compute-0 NetworkManager[54954]: <info>  [1769121574.4475] device (tap9a6dc28c-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:39:34 compute-0 ovn_controller[94850]: 2026-01-22T22:39:34Z|00498|binding|INFO|Releasing lport 9a6dc28c-828d-435c-b619-7c51693137c4 from this chassis (sb_readonly=0)
Jan 22 22:39:34 compute-0 ovn_controller[94850]: 2026-01-22T22:39:34Z|00499|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 down in Southbound
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 ovn_controller[94850]: 2026-01-22T22:39:34Z|00500|binding|INFO|Removing iface tap9a6dc28c-82 ovn-installed in OVS
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.468 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:f6:e8 10.100.0.14'], port_security=['fa:16:3e:2d:f6:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1f30876-5cd8-4e98-8834-4f70a79dfb84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec375847-f056-44d3-9d58-90e08d020586, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9a6dc28c-828d-435c-b619-7c51693137c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.469 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.470 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9a6dc28c-828d-435c-b619-7c51693137c4 in datapath 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 unbound from our chassis
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.472 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.473 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1bb4a6-7a42-4168-ba6b-f4f96ed4378a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.474 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 namespace which is not needed anymore
Jan 22 22:39:34 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 22 22:39:34 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000080.scope: Consumed 12.846s CPU time.
Jan 22 22:39:34 compute-0 systemd-machined[154006]: Machine qemu-58-instance-00000080 terminated.
Jan 22 22:39:34 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [NOTICE]   (229621) : haproxy version is 2.8.14-c23fe91
Jan 22 22:39:34 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [NOTICE]   (229621) : path to executable is /usr/sbin/haproxy
Jan 22 22:39:34 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [WARNING]  (229621) : Exiting Master process...
Jan 22 22:39:34 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [ALERT]    (229621) : Current worker (229623) exited with code 143 (Terminated)
Jan 22 22:39:34 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[229617]: [WARNING]  (229621) : All workers exited. Exiting... (0)
Jan 22 22:39:34 compute-0 systemd[1]: libpod-43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0.scope: Deactivated successfully.
Jan 22 22:39:34 compute-0 podman[229919]: 2026-01-22 22:39:34.594279131 +0000 UTC m=+0.041862846 container died 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:39:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0-userdata-shm.mount: Deactivated successfully.
Jan 22 22:39:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-a30a713fece18f4e45710de58c602e4dace33a2f92854cdbb64a5ce891d4d33a-merged.mount: Deactivated successfully.
Jan 22 22:39:34 compute-0 podman[229919]: 2026-01-22 22:39:34.629270954 +0000 UTC m=+0.076854669 container cleanup 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:39:34 compute-0 systemd[1]: libpod-conmon-43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0.scope: Deactivated successfully.
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.683 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 podman[229950]: 2026-01-22 22:39:34.686551084 +0000 UTC m=+0.038251916 container remove 43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.691 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[981a45f8-1fad-4334-904f-a10ec24a9f26]: (4, ('Thu Jan 22 10:39:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 (43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0)\n43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0\nThu Jan 22 10:39:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 (43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0)\n43ced917661a2244dc6a08f70b5df4d4ceb35398aa83e7eec6f93bb1dc60b0f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.693 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1a0e0a-6945-4ca2-92d4-877408a42e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.694 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a53a2bd-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.696 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 kernel: tap7a53a2bd-50: left promiscuous mode
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.712 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 nova_compute[182725]: 2026-01-22 22:39:34.714 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.716 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6af3211e-0e4f-4c00-bb60-ce613d4a8718]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.732 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9c205969-f9b3-411c-969c-eac8f226e3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.733 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e76d247f-4835-483f-834d-185b4605564a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.747 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0dfa7a-9c54-428c-88e1-875093c37ca8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518685, 'reachable_time': 29958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229983, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.749 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:39:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:34.749 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[66321e19-2d99-47d7-bf5c-6054d3298d4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d7a53a2bd\x2d56e7\x2d4238\x2da2c4\x2d69b2eaef9fb9.mount: Deactivated successfully.
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.209 182729 DEBUG nova.compute.manager [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.210 182729 DEBUG oslo_concurrency.lockutils [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.210 182729 DEBUG oslo_concurrency.lockutils [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.211 182729 DEBUG oslo_concurrency.lockutils [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.211 182729 DEBUG nova.compute.manager [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.211 182729 WARNING nova.compute.manager [req-ed4cc49c-3320-45fc-951f-6a5b1d76cc34 req-7690b7f2-7538-45ee-a186-f3b35a30e152 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state reboot_started.
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.252 182729 INFO nova.virt.libvirt.driver [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance shutdown successfully.
Jan 22 22:39:35 compute-0 kernel: tap9a6dc28c-82: entered promiscuous mode
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.2995] manager: (tap9a6dc28c-82): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Jan 22 22:39:35 compute-0 systemd-udevd[229898]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:39:35 compute-0 ovn_controller[94850]: 2026-01-22T22:39:35Z|00501|binding|INFO|Claiming lport 9a6dc28c-828d-435c-b619-7c51693137c4 for this chassis.
Jan 22 22:39:35 compute-0 ovn_controller[94850]: 2026-01-22T22:39:35Z|00502|binding|INFO|9a6dc28c-828d-435c-b619-7c51693137c4: Claiming fa:16:3e:2d:f6:e8 10.100.0.14
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.302 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.311 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:f6:e8 10.100.0.14'], port_security=['fa:16:3e:2d:f6:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e1f30876-5cd8-4e98-8834-4f70a79dfb84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec375847-f056-44d3-9d58-90e08d020586, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9a6dc28c-828d-435c-b619-7c51693137c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.312 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9a6dc28c-828d-435c-b619-7c51693137c4 in datapath 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 bound to our chassis
Jan 22 22:39:35 compute-0 ovn_controller[94850]: 2026-01-22T22:39:35Z|00503|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 ovn-installed in OVS
Jan 22 22:39:35 compute-0 ovn_controller[94850]: 2026-01-22T22:39:35Z|00504|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 up in Southbound
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.315 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.315 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.3189] device (tap9a6dc28c-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.3193] device (tap9a6dc28c-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.325 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[760fbd47-d97f-4317-b617-cd4339a2377c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.326 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a53a2bd-51 in ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.328 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a53a2bd-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.329 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d915ccea-13d0-45e5-90c9-0e944af67018]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.329 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[baee11a9-d316-4202-b2cb-dbfdf2b77f95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.341 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[876574e6-bfba-410c-927f-bbc70068981b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 systemd-machined[154006]: New machine qemu-59-instance-00000080.
Jan 22 22:39:35 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-00000080.
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.355 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a0ae78-ec6e-4b0a-ab4e-0e783d7ed113]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.384 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e77413e4-4aea-4b0e-ac2b-57de18715173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.3913] manager: (tap7a53a2bd-50): new Veth device (/org/freedesktop/NetworkManager/Devices/234)
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.391 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[27ec57ee-e997-468a-a55e-7ce069853b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.424 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7559b0b7-abf1-43c9-952b-441a3a44b620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.427 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[026502b3-ac56-4b4b-b044-5d535d760a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.4474] device (tap7a53a2bd-50): carrier: link connected
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.453 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2e419c0a-5a3f-491a-84f1-62d0c90e689b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.468 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[da727e89-b305-497e-ac54-0ae38e594545]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a53a2bd-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:68:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521305, 'reachable_time': 26743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230030, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.484 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b78568-b2fd-4c2a-b578-a2e8accf6d5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:68a6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521305, 'tstamp': 521305}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230031, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.497 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f6577029-d2a1-40c1-a7c7-3a133b577dc8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a53a2bd-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:68:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521305, 'reachable_time': 26743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230032, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.523 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[01455d5c-cce6-49e4-bc68-144022409b8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.578 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ed93baf7-b172-4176-a07e-adc829c5c0ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.580 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a53a2bd-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.580 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.580 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a53a2bd-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:35 compute-0 NetworkManager[54954]: <info>  [1769121575.5827] manager: (tap7a53a2bd-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.582 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 kernel: tap7a53a2bd-50: entered promiscuous mode
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.585 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a53a2bd-50, col_values=(('external_ids', {'iface-id': 'b9ddfa27-d43e-4650-8846-a57a89114a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.586 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 ovn_controller[94850]: 2026-01-22T22:39:35Z|00505|binding|INFO|Releasing lport b9ddfa27-d43e-4650-8846-a57a89114a49 from this chassis (sb_readonly=0)
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.587 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.588 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.589 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b9514d8a-aef7-4621-ba46-2684630137f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.590 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.pid.haproxy
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:39:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:35.591 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'env', 'PROCESS_TAG=haproxy-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.710 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.710 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121575.7097461, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.711 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Resumed (Lifecycle Event)
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.742 182729 INFO nova.virt.libvirt.driver [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance running successfully.
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.742 182729 INFO nova.virt.libvirt.driver [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance soft rebooted successfully.
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.743 182729 DEBUG nova.compute.manager [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.761 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.765 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.793 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] During sync_power_state the instance has a pending task (reboot_started). Skip.
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.793 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121575.7390969, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.793 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Started (Lifecycle Event)
Jan 22 22:39:35 compute-0 nova_compute[182725]: 2026-01-22 22:39:35.907 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:35 compute-0 podman[230071]: 2026-01-22 22:39:35.936369131 +0000 UTC m=+0.044366738 container create d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:39:35 compute-0 systemd[1]: Started libpod-conmon-d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f.scope.
Jan 22 22:39:36 compute-0 podman[230071]: 2026-01-22 22:39:35.912433114 +0000 UTC m=+0.020430741 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:39:36 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:39:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c48b7242c21f99be5c6a07a9e026e2561a8632bc6f9266c64ef22d64370c3e8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:39:36 compute-0 podman[230071]: 2026-01-22 22:39:36.036932921 +0000 UTC m=+0.144930578 container init d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:39:36 compute-0 nova_compute[182725]: 2026-01-22 22:39:36.037 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:36 compute-0 nova_compute[182725]: 2026-01-22 22:39:36.041 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:36 compute-0 podman[230071]: 2026-01-22 22:39:36.04247938 +0000 UTC m=+0.150476997 container start d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:39:36 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [NOTICE]   (230090) : New worker (230092) forked
Jan 22 22:39:36 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [NOTICE]   (230090) : Loading success.
Jan 22 22:39:36 compute-0 nova_compute[182725]: 2026-01-22 22:39:36.088 182729 DEBUG oslo_concurrency.lockutils [None req-1cff9783-56bc-4a0e-97a0-50aee695eaef 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.781 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.963 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.963 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.963 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.963 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 WARNING nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state None.
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.964 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 WARNING nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state None.
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG oslo_concurrency.lockutils [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.965 182729 DEBUG nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:37 compute-0 nova_compute[182725]: 2026-01-22 22:39:37.966 182729 WARNING nova.compute.manager [req-0838ecce-4e33-4a78-9c7b-da5441dba86d req-3e036aa8-bbdf-459f-ab83-c36458563e67 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state None.
Jan 22 22:39:39 compute-0 nova_compute[182725]: 2026-01-22 22:39:39.177 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:40 compute-0 nova_compute[182725]: 2026-01-22 22:39:40.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:42 compute-0 podman[230103]: 2026-01-22 22:39:42.148403271 +0000 UTC m=+0.070675695 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:39:42 compute-0 podman[230104]: 2026-01-22 22:39:42.152949035 +0000 UTC m=+0.074850990 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:39:42 compute-0 podman[230102]: 2026-01-22 22:39:42.163104288 +0000 UTC m=+0.081734781 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:39:42 compute-0 nova_compute[182725]: 2026-01-22 22:39:42.786 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:44 compute-0 nova_compute[182725]: 2026-01-22 22:39:44.181 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.114 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.115 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.141 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.284 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.286 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.299 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.300 182729 INFO nova.compute.claims [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.696 182729 DEBUG nova.compute.provider_tree [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.733 182729 DEBUG nova.scheduler.client.report [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.756 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.757 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.853 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.854 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.881 182729 INFO nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:39:45 compute-0 nova_compute[182725]: 2026-01-22 22:39:45.904 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.033 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.034 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.054 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.103 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.105 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.105 182729 INFO nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Creating image(s)
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.106 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.106 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.107 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.118 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.191 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.192 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.193 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.203 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.220 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.220 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.223 182729 DEBUG nova.policy [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.231 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.232 182729 INFO nova.compute.claims [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.260 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.261 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.321 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.322 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.323 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.391 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.392 182729 DEBUG nova.virt.disk.api [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.393 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.447 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.449 182729 DEBUG nova.virt.disk.api [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.450 182729 DEBUG nova.objects.instance [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.494 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.495 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Ensure instance console log exists: /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.495 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.496 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.496 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.588 182729 DEBUG nova.compute.provider_tree [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.622 182729 DEBUG nova.scheduler.client.report [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.643 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.644 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.725 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.728 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.774 182729 INFO nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.799 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.951 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.953 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.954 182729 INFO nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Creating image(s)
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.955 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.956 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.957 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:46 compute-0 nova_compute[182725]: 2026-01-22 22:39:46.981 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.050 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.052 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.053 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.081 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.107 182729 DEBUG nova.policy [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.142 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.144 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.232 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk 1073741824" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.234 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.235 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.339 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.341 182729 DEBUG nova.virt.disk.api [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Checking if we can resize image /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.342 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.405 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.407 182729 DEBUG nova.virt.disk.api [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Cannot resize image /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.408 182729 DEBUG nova.objects.instance [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'migration_context' on Instance uuid 9273e299-fee4-42e3-a2d9-f8b355cc5cfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.431 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.432 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Ensure instance console log exists: /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.433 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.434 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.434 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:47 compute-0 nova_compute[182725]: 2026-01-22 22:39:47.790 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:47 compute-0 ovn_controller[94850]: 2026-01-22T22:39:47Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:f6:e8 10.100.0.14
Jan 22 22:39:48 compute-0 nova_compute[182725]: 2026-01-22 22:39:48.452 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Successfully created port: 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:39:48 compute-0 nova_compute[182725]: 2026-01-22 22:39:48.675 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Successfully created port: 427a76c6-9759-412b-9f78-6d0e033fa0c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:39:49 compute-0 nova_compute[182725]: 2026-01-22 22:39:49.183 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:49 compute-0 nova_compute[182725]: 2026-01-22 22:39:49.936 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Successfully updated port: 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:39:49 compute-0 nova_compute[182725]: 2026-01-22 22:39:49.958 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:49 compute-0 nova_compute[182725]: 2026-01-22 22:39:49.959 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:49 compute-0 nova_compute[182725]: 2026-01-22 22:39:49.959 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:39:50 compute-0 nova_compute[182725]: 2026-01-22 22:39:50.066 182729 DEBUG nova.compute.manager [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:50 compute-0 nova_compute[182725]: 2026-01-22 22:39:50.066 182729 DEBUG nova.compute.manager [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing instance network info cache due to event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:50 compute-0 nova_compute[182725]: 2026-01-22 22:39:50.066 182729 DEBUG oslo_concurrency.lockutils [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:50 compute-0 nova_compute[182725]: 2026-01-22 22:39:50.935 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:39:52 compute-0 sshd-session[230203]: Connection closed by 173.255.228.234 port 58866
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.878 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.964 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Successfully updated port: 427a76c6-9759-412b-9f78-6d0e033fa0c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.984 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.984 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquired lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:52 compute-0 nova_compute[182725]: 2026-01-22 22:39:52.985 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:39:53 compute-0 nova_compute[182725]: 2026-01-22 22:39:53.118 182729 DEBUG nova.compute.manager [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-changed-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:53 compute-0 nova_compute[182725]: 2026-01-22 22:39:53.119 182729 DEBUG nova.compute.manager [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Refreshing instance network info cache due to event network-changed-427a76c6-9759-412b-9f78-6d0e033fa0c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:53 compute-0 nova_compute[182725]: 2026-01-22 22:39:53.119 182729 DEBUG oslo_concurrency.lockutils [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:53 compute-0 nova_compute[182725]: 2026-01-22 22:39:53.912 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:39:54 compute-0 nova_compute[182725]: 2026-01-22 22:39:54.185 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:54 compute-0 nova_compute[182725]: 2026-01-22 22:39:54.704 182729 INFO nova.compute.manager [None req-9dbbbf0a-8e4c-4ae3-ab27-37b58f83080a 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Get console output
Jan 22 22:39:54 compute-0 nova_compute[182725]: 2026-01-22 22:39:54.710 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:39:54 compute-0 nova_compute[182725]: 2026-01-22 22:39:54.976 182729 DEBUG nova.network.neutron [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.005 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.006 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Instance network_info: |[{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.006 182729 DEBUG oslo_concurrency.lockutils [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.007 182729 DEBUG nova.network.neutron [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.011 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Start _get_guest_xml network_info=[{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.017 182729 WARNING nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.024 182729 DEBUG nova.virt.libvirt.host [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.025 182729 DEBUG nova.virt.libvirt.host [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.027 182729 DEBUG nova.virt.libvirt.host [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.027 182729 DEBUG nova.virt.libvirt.host [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.028 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.029 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.029 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.030 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.030 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.030 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.030 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.031 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.031 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.031 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.031 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.032 182729 DEBUG nova.virt.hardware [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.035 182729 DEBUG nova.virt.libvirt.vif [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:45Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.036 182729 DEBUG nova.network.os_vif_util [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.037 182729 DEBUG nova.network.os_vif_util [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.039 182729 DEBUG nova.objects.instance [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.056 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <uuid>e6a1471a-80f0-43ff-95e0-b865b6134ab6</uuid>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <name>instance-00000084</name>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:39:55</nova:creationTime>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:39:55 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <system>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="serial">e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="uuid">e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </system>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <os>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </os>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <features>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </features>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:52:c2:50"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="tap22e0ead7-6f"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log" append="off"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <video>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </video>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:39:55 compute-0 nova_compute[182725]: </domain>
Jan 22 22:39:55 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.057 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Preparing to wait for external event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.058 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.058 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.059 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.060 182729 DEBUG nova.virt.libvirt.vif [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:45Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.060 182729 DEBUG nova.network.os_vif_util [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.061 182729 DEBUG nova.network.os_vif_util [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.062 182729 DEBUG os_vif [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.063 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.064 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.067 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.068 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e0ead7-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.068 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap22e0ead7-6f, col_values=(('external_ids', {'iface-id': '22e0ead7-6f30-4530-8c7a-18ca9aeeab12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:c2:50', 'vm-uuid': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 NetworkManager[54954]: <info>  [1769121595.0720] manager: (tap22e0ead7-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.073 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.081 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.083 182729 INFO os_vif [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f')
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.171 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.171 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.172 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:52:c2:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.172 182729 INFO nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Using config drive
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.846 182729 DEBUG nova.network.neutron [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Updating instance_info_cache with network_info: [{"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.882 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Releasing lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.882 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Instance network_info: |[{"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.882 182729 DEBUG oslo_concurrency.lockutils [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.883 182729 DEBUG nova.network.neutron [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Refreshing network info cache for port 427a76c6-9759-412b-9f78-6d0e033fa0c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.885 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Start _get_guest_xml network_info=[{"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.890 182729 WARNING nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.894 182729 DEBUG nova.virt.libvirt.host [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.895 182729 DEBUG nova.virt.libvirt.host [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.898 182729 DEBUG nova.virt.libvirt.host [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.899 182729 DEBUG nova.virt.libvirt.host [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.900 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.900 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.900 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.900 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.901 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.901 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.901 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.901 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.902 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.902 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.902 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.902 182729 DEBUG nova.virt.hardware [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.905 182729 DEBUG nova.virt.libvirt.vif [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-810446562',display_name='tempest-ServersNegativeTestJSON-server-810446562',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-810446562',id=133,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-tnoa5gq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeT
estJSON-2095273166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:46Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=9273e299-fee4-42e3-a2d9-f8b355cc5cfe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.906 182729 DEBUG nova.network.os_vif_util [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.906 182729 DEBUG nova.network.os_vif_util [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.907 182729 DEBUG nova.objects.instance [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'pci_devices' on Instance uuid 9273e299-fee4-42e3-a2d9-f8b355cc5cfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.915 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.916 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.916 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.916 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.916 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.926 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <uuid>9273e299-fee4-42e3-a2d9-f8b355cc5cfe</uuid>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <name>instance-00000085</name>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:name>tempest-ServersNegativeTestJSON-server-810446562</nova:name>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:39:55</nova:creationTime>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:user uuid="45cd11974e6648e1872fb5ebf9dee0b1">tempest-ServersNegativeTestJSON-2095273166-project-member</nova:user>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:project uuid="5906f64d8ee84f068ff9caa68ae3652b">tempest-ServersNegativeTestJSON-2095273166</nova:project>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         <nova:port uuid="427a76c6-9759-412b-9f78-6d0e033fa0c9">
Jan 22 22:39:55 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <system>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="serial">9273e299-fee4-42e3-a2d9-f8b355cc5cfe</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="uuid">9273e299-fee4-42e3-a2d9-f8b355cc5cfe</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </system>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <os>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </os>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <features>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </features>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.config"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:58:e4:66"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <target dev="tap427a76c6-97"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/console.log" append="off"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <video>
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </video>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:39:55 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:39:55 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:39:55 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:39:55 compute-0 nova_compute[182725]: </domain>
Jan 22 22:39:55 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.927 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Preparing to wait for external event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.927 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.927 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.928 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.928 182729 DEBUG nova.virt.libvirt.vif [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-810446562',display_name='tempest-ServersNegativeTestJSON-server-810446562',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-810446562',id=133,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-tnoa5gq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:46Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=9273e299-fee4-42e3-a2d9-f8b355cc5cfe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.929 182729 DEBUG nova.network.os_vif_util [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.929 182729 DEBUG nova.network.os_vif_util [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.930 182729 DEBUG os_vif [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.930 182729 INFO nova.compute.manager [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Terminating instance
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.932 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.932 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.932 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.938 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap427a76c6-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.939 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap427a76c6-97, col_values=(('external_ids', {'iface-id': '427a76c6-9759-412b-9f78-6d0e033fa0c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:e4:66', 'vm-uuid': '9273e299-fee4-42e3-a2d9-f8b355cc5cfe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 NetworkManager[54954]: <info>  [1769121595.9419] manager: (tap427a76c6-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.944 182729 DEBUG nova.compute.manager [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.948 182729 INFO nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Creating config drive at /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.955 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo8ab4i3n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.977 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.980 182729 INFO os_vif [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97')
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.986 182729 DEBUG nova.compute.manager [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.986 182729 DEBUG nova.compute.manager [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing instance network info cache due to event network-changed-9a6dc28c-828d-435c-b619-7c51693137c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.987 182729 DEBUG oslo_concurrency.lockutils [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.987 182729 DEBUG oslo_concurrency.lockutils [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:39:55 compute-0 nova_compute[182725]: 2026-01-22 22:39:55.987 182729 DEBUG nova.network.neutron [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Refreshing network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:39:56 compute-0 kernel: tap9a6dc28c-82 (unregistering): left promiscuous mode
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.0098] device (tap9a6dc28c-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00506|binding|INFO|Releasing lport 9a6dc28c-828d-435c-b619-7c51693137c4 from this chassis (sb_readonly=0)
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00507|binding|INFO|Setting lport 9a6dc28c-828d-435c-b619-7c51693137c4 down in Southbound
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00508|binding|INFO|Removing iface tap9a6dc28c-82 ovn-installed in OVS
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.034 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:f6:e8 10.100.0.14'], port_security=['fa:16:3e:2d:f6:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e1f30876-5cd8-4e98-8834-4f70a79dfb84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec375847-f056-44d3-9d58-90e08d020586, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9a6dc28c-828d-435c-b619-7c51693137c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.036 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9a6dc28c-828d-435c-b619-7c51693137c4 in datapath 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 unbound from our chassis
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.038 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.039 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf92d41-769b-4002-a753-89421963e149]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.041 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 namespace which is not needed anymore
Jan 22 22:39:56 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.061 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.062 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:39:56 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000080.scope: Consumed 12.442s CPU time.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.062 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No VIF found with MAC fa:16:3e:58:e4:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.062 182729 INFO nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Using config drive
Jan 22 22:39:56 compute-0 systemd-machined[154006]: Machine qemu-59-instance-00000080 terminated.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.089 182729 DEBUG oslo_concurrency.processutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo8ab4i3n" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:56 compute-0 kernel: tap22e0ead7-6f: entered promiscuous mode
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.1779] manager: (tap22e0ead7-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 22 22:39:56 compute-0 systemd-udevd[230217]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00509|binding|INFO|Claiming lport 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 for this chassis.
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00510|binding|INFO|22e0ead7-6f30-4530-8c7a-18ca9aeeab12: Claiming fa:16:3e:52:c2:50 10.100.0.10
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.187 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.2002] device (tap22e0ead7-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.2011] device (tap22e0ead7-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00511|binding|INFO|Setting lport 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 ovn-installed in OVS
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.205 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 ovn_controller[94850]: 2026-01-22T22:39:56Z|00512|binding|INFO|Setting lport 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 up in Southbound
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.210 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:c2:50 10.100.0.10'], port_security=['fa:16:3e:52:c2:50 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb96457-41f3-4931-8421-59ae568f6512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7df5b530-2858-4af9-8ee2-0c5e2e8071be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b987ad2-bd3d-4a80-a6eb-b548d3af0bc7, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=22e0ead7-6f30-4530-8c7a-18ca9aeeab12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.213 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.2262] manager: (tap9a6dc28c-82): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.233 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 systemd-machined[154006]: New machine qemu-60-instance-00000084.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.246 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-00000084.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.283 182729 INFO nova.virt.libvirt.driver [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Instance destroyed successfully.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.284 182729 DEBUG nova.objects.instance [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.297 182729 DEBUG nova.virt.libvirt.vif [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-920730235',display_name='tempest-TestNetworkAdvancedServerOps-server-920730235',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-920730235',id=128,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKipAECYwv/kwfCJ76CvMOtJresIZaDLlyUkiFxcTlYjbAX4511FUkSKueMkA0cQfc3M7mxwBESGCBPari0ZsqICW2HR5vgwqLD0i+BZnu1BuFVZ3D4TfH+oikr45N3Ctg==',key_name='tempest-TestNetworkAdvancedServerOps-822496345',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-dma0x6tw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:35Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.298 182729 DEBUG nova.network.os_vif_util [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.298 182729 DEBUG nova.network.os_vif_util [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.299 182729 DEBUG os_vif [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.301 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.301 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a6dc28c-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.307 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.308 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.310 182729 INFO os_vif [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:f6:e8,bridge_name='br-int',has_traffic_filtering=True,id=9a6dc28c-828d-435c-b619-7c51693137c4,network=Network(7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a6dc28c-82')
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.311 182729 INFO nova.virt.libvirt.driver [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Deleting instance files /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f_del
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.312 182729 INFO nova.virt.libvirt.driver [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Deletion of /var/lib/nova/instances/dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f_del complete
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [NOTICE]   (230090) : haproxy version is 2.8.14-c23fe91
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [NOTICE]   (230090) : path to executable is /usr/sbin/haproxy
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [WARNING]  (230090) : Exiting Master process...
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [WARNING]  (230090) : Exiting Master process...
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [ALERT]    (230090) : Current worker (230092) exited with code 143 (Terminated)
Jan 22 22:39:56 compute-0 neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9[230086]: [WARNING]  (230090) : All workers exited. Exiting... (0)
Jan 22 22:39:56 compute-0 systemd[1]: libpod-d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f.scope: Deactivated successfully.
Jan 22 22:39:56 compute-0 podman[230244]: 2026-01-22 22:39:56.344143116 +0000 UTC m=+0.195188343 container died d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:39:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f-userdata-shm.mount: Deactivated successfully.
Jan 22 22:39:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-c48b7242c21f99be5c6a07a9e026e2561a8632bc6f9266c64ef22d64370c3e8a-merged.mount: Deactivated successfully.
Jan 22 22:39:56 compute-0 podman[230244]: 2026-01-22 22:39:56.394625846 +0000 UTC m=+0.245671083 container cleanup d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:39:56 compute-0 systemd[1]: libpod-conmon-d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f.scope: Deactivated successfully.
Jan 22 22:39:56 compute-0 podman[230311]: 2026-01-22 22:39:56.474754556 +0000 UTC m=+0.048753528 container remove d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.484 182729 INFO nova.compute.manager [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Took 0.54 seconds to destroy the instance on the hypervisor.
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.484 182729 DEBUG oslo.service.loopingcall [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.485 182729 DEBUG nova.compute.manager [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.485 182729 DEBUG nova.network.neutron [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.490 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[92bb4e8f-eced-4862-99b3-e6e1b06eceaf]: (4, ('Thu Jan 22 10:39:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 (d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f)\nd3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f\nThu Jan 22 10:39:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 (d3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f)\nd3159cae0ba47880c029c86ff22cb7b94f5a2d1d4341ae03c951fc400a62af9f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.492 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e259846d-b678-4e28-be2a-45549425ae22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.493 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a53a2bd-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 kernel: tap7a53a2bd-50: left promiscuous mode
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.523 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.523 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5fec9a-7bb2-4e17-9e97-5c2ec0a5b3f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.544 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e02e9234-1766-4763-ae15-c6e1db7b2277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.545 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[230d8e52-e42b-42b3-a21c-f2e4b31c67b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.568 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[778492e0-c8b1-4e3b-af52-5f95879ebeb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521298, 'reachable_time': 33057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230331, 'error': None, 'target': 'ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.572 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.573 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd67f8f-5c81-49a7-9e8f-6cc62363eab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d7a53a2bd\x2d56e7\x2d4238\x2da2c4\x2d69b2eaef9fb9.mount: Deactivated successfully.
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.575 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 in datapath fbb96457-41f3-4931-8421-59ae568f6512 unbound from our chassis
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.577 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbb96457-41f3-4931-8421-59ae568f6512
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.592 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[96925ab1-767d-4755-9eaa-e1e24120b4a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.593 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbb96457-41 in ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.596 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbb96457-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.596 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1209f4-04f3-4e39-8790-3592e0f2d33b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.597 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d88ff9-a46a-4ac0-8a00-a1a7bfb13a8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.612 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4846349b-d2d4-4080-aba6-e28c25edb165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.631 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7436da0c-fd5a-4821-9a11-571b23a22165]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.670 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dde95e-804c-45fb-bfe4-cf4dbba128bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.676 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[249b485c-7dbb-4001-9d1b-56d35c262e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.6804] manager: (tapfbb96457-40): new Veth device (/org/freedesktop/NetworkManager/Devices/240)
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.719 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f243f507-32d8-48cf-b867-13bdcd3f9537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.723 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4e445262-9f8b-4779-9988-b472fad400da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 NetworkManager[54954]: <info>  [1769121596.7564] device (tapfbb96457-40): carrier: link connected
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.766 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[55916d7b-20d9-491d-a128-0301f3f83f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.787 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9b44a2a0-ecc6-4b3b-a3c2-ec6d52b351b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbb96457-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:9c:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523435, 'reachable_time': 34034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230358, 'error': None, 'target': 'ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.811 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[10f096e4-729c-4875-b3cf-11f117166c53]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:9cbc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523435, 'tstamp': 523435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230359, 'error': None, 'target': 'ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.841 182729 INFO nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Creating config drive at /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.config
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.841 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc7857f-1c31-4aaf-8217-85be17890b7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbb96457-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:9c:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523435, 'reachable_time': 34034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230360, 'error': None, 'target': 'ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.846 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr1qfr2l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:39:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.887 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[06e73d82-e8be-49d6-820c-018e7c655309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:56 compute-0 nova_compute[182725]: 2026-01-22 22:39:56.988 182729 DEBUG oslo_concurrency.processutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr1qfr2l" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:56.992 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9a776fc6-879d-4080-9ec8-f8e5001ca0ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.002 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb96457-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.003 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.003 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbb96457-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.0091] manager: (tapfbb96457-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 kernel: tapfbb96457-40: entered promiscuous mode
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.028 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbb96457-40, col_values=(('external_ids', {'iface-id': 'cf3a3ad2-5afe-400f-b31e-2a0edf61e11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:57 compute-0 ovn_controller[94850]: 2026-01-22T22:39:57Z|00513|binding|INFO|Releasing lport cf3a3ad2-5afe-400f-b31e-2a0edf61e11b from this chassis (sb_readonly=0)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.057 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.064 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbb96457-41f3-4931-8421-59ae568f6512.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbb96457-41f3-4931-8421-59ae568f6512.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.065 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9594fef3-5cad-4239-8f9a-27388e495bb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.066 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-fbb96457-41f3-4931-8421-59ae568f6512
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/fbb96457-41f3-4931-8421-59ae568f6512.pid.haproxy
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID fbb96457-41f3-4931-8421-59ae568f6512
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.067 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512', 'env', 'PROCESS_TAG=haproxy-fbb96457-41f3-4931-8421-59ae568f6512', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbb96457-41f3-4931-8421-59ae568f6512.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.0816] manager: (tap427a76c6-97): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Jan 22 22:39:57 compute-0 kernel: tap427a76c6-97: entered promiscuous mode
Jan 22 22:39:57 compute-0 systemd-udevd[230341]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.084 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 ovn_controller[94850]: 2026-01-22T22:39:57Z|00514|binding|INFO|Claiming lport 427a76c6-9759-412b-9f78-6d0e033fa0c9 for this chassis.
Jan 22 22:39:57 compute-0 ovn_controller[94850]: 2026-01-22T22:39:57Z|00515|binding|INFO|427a76c6-9759-412b-9f78-6d0e033fa0c9: Claiming fa:16:3e:58:e4:66 10.100.0.10
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.1012] device (tap427a76c6-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.1021] device (tap427a76c6-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.104 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:e4:66 10.100.0.10'], port_security=['fa:16:3e:58:e4:66 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9273e299-fee4-42e3-a2d9-f8b355cc5cfe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=427a76c6-9759-412b-9f78-6d0e033fa0c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:39:57 compute-0 ovn_controller[94850]: 2026-01-22T22:39:57Z|00516|binding|INFO|Setting lport 427a76c6-9759-412b-9f78-6d0e033fa0c9 ovn-installed in OVS
Jan 22 22:39:57 compute-0 ovn_controller[94850]: 2026-01-22T22:39:57Z|00517|binding|INFO|Setting lport 427a76c6-9759-412b-9f78-6d0e033fa0c9 up in Southbound
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.106 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:57 compute-0 systemd-machined[154006]: New machine qemu-61-instance-00000085.
Jan 22 22:39:57 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-00000085.
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.228 182729 DEBUG nova.compute.manager [req-59b72c7e-243f-496f-b45b-2f590c3f488c req-3cb2f261-e0f8-448a-bc8c-81192b623a6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.230 182729 DEBUG oslo_concurrency.lockutils [req-59b72c7e-243f-496f-b45b-2f590c3f488c req-3cb2f261-e0f8-448a-bc8c-81192b623a6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.230 182729 DEBUG oslo_concurrency.lockutils [req-59b72c7e-243f-496f-b45b-2f590c3f488c req-3cb2f261-e0f8-448a-bc8c-81192b623a6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.231 182729 DEBUG oslo_concurrency.lockutils [req-59b72c7e-243f-496f-b45b-2f590c3f488c req-3cb2f261-e0f8-448a-bc8c-81192b623a6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.231 182729 DEBUG nova.compute.manager [req-59b72c7e-243f-496f-b45b-2f590c3f488c req-3cb2f261-e0f8-448a-bc8c-81192b623a6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Processing event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:39:57 compute-0 podman[230391]: 2026-01-22 22:39:57.282304554 +0000 UTC m=+0.087758942 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.303 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.305 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121597.3024418, e6a1471a-80f0-43ff-95e0-b865b6134ab6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.305 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] VM Started (Lifecycle Event)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.316 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.322 182729 INFO nova.virt.libvirt.driver [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Instance spawned successfully.
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.323 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.341 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.350 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.357 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.358 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.359 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.360 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.362 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.363 182729 DEBUG nova.virt.libvirt.driver [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.380 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.381 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121597.3041873, e6a1471a-80f0-43ff-95e0-b865b6134ab6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.381 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] VM Paused (Lifecycle Event)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.425 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.429 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121597.3145244, e6a1471a-80f0-43ff-95e0-b865b6134ab6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.430 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] VM Resumed (Lifecycle Event)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.455 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.459 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.493 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:57 compute-0 podman[230442]: 2026-01-22 22:39:57.496289825 +0000 UTC m=+0.069239169 container create 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.499 182729 INFO nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Took 11.39 seconds to spawn the instance on the hypervisor.
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.499 182729 DEBUG nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:57 compute-0 systemd[1]: Started libpod-conmon-318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78.scope.
Jan 22 22:39:57 compute-0 podman[230442]: 2026-01-22 22:39:57.468139992 +0000 UTC m=+0.041089327 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.585 182729 INFO nova.compute.manager [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Took 12.37 seconds to build instance.
Jan 22 22:39:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:39:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66fbdb007b9fdaabd9e148d492367a94ba33deefd4161989bb74ad36c8592944/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:39:57 compute-0 podman[230442]: 2026-01-22 22:39:57.606008784 +0000 UTC m=+0.178958178 container init 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.609 182729 DEBUG oslo_concurrency.lockutils [None req-0d504c48-8836-4561-80b3-443510855810 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.494s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:57 compute-0 podman[230442]: 2026-01-22 22:39:57.612376053 +0000 UTC m=+0.185325397 container start 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:39:57 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [NOTICE]   (230461) : New worker (230463) forked
Jan 22 22:39:57 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [NOTICE]   (230461) : Loading success.
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.702 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 427a76c6-9759-412b-9f78-6d0e033fa0c9 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 unbound from our chassis
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.704 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.715 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ce970313-ac61-41c6-992b-81c14edbb871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.716 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ab2e5b-01 in ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.719 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ab2e5b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.719 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b7535902-0744-4377-8700-37d5b1443e99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.720 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cf2840-2b92-4c0a-9441-ff02c9761820]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.737 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8e176a-7aa1-48b0-bd03-38a71fe66157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.752 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[34d69261-d02c-4320-8763-c08c89934fac]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.802 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d501b5a4-3ad7-4c99-992f-0e20f6cee642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.808 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[70583008-9c4f-493e-aaa4-618f5a6dc231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.8104] manager: (tap17ab2e5b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.850 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[cc257f60-5486-4144-9148-730238eedc56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.853 182729 DEBUG nova.network.neutron [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updated VIF entry in instance network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.855 182729 DEBUG nova.network.neutron [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.857 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[408371d6-3132-49bb-b1bc-6dc2c958f3cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.881 182729 DEBUG oslo_concurrency.lockutils [req-acbdf58f-0a6f-417f-a491-05660773e8e0 req-167adf3b-f12a-4d55-a105-21f8b6080680 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:57 compute-0 NetworkManager[54954]: <info>  [1769121597.8860] device (tap17ab2e5b-00): carrier: link connected
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.893 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a34739-257d-4ec1-820a-43c1a2dfd9f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.914 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2c352459-d72f-4a65-9d55-6176360b3a5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523548, 'reachable_time': 24384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230488, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.931 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[085f08fb-8c37-444b-a3cd-24b56068a096]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:d4d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523548, 'tstamp': 523548}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230490, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.940 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.956 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[40338c9a-b991-4d56-9413-75bf17e915ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523548, 'reachable_time': 24384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230491, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.979 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121597.978744, 9273e299-fee4-42e3-a2d9-f8b355cc5cfe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:57 compute-0 nova_compute[182725]: 2026-01-22 22:39:57.979 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] VM Started (Lifecycle Event)
Jan 22 22:39:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:57.988 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[588c4a73-1229-4e86-9030-09ba1094a811]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.006 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.010 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121597.978974, 9273e299-fee4-42e3-a2d9-f8b355cc5cfe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.011 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] VM Paused (Lifecycle Event)
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.047 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.047 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[825744ff-8af7-4059-b65b-18ec6122efd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.049 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.049 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.050 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ab2e5b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.050 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:58 compute-0 kernel: tap17ab2e5b-00: entered promiscuous mode
Jan 22 22:39:58 compute-0 NetworkManager[54954]: <info>  [1769121598.0526] manager: (tap17ab2e5b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.052 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.058 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.060 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ab2e5b-00, col_values=(('external_ids', {'iface-id': 'e1725d3a-3bc9-46b5-a1d1-153d0147aff7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:39:58 compute-0 ovn_controller[94850]: 2026-01-22T22:39:58Z|00518|binding|INFO|Releasing lport e1725d3a-3bc9-46b5-a1d1-153d0147aff7 from this chassis (sb_readonly=0)
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.063 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.062 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.066 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8223502f-1275-4fdb-8cd7-567a1a6bbf62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.067 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:39:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:39:58.067 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'env', 'PROCESS_TAG=haproxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ab2e5b-049b-4984-a18a-6b3e44614ef5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.073 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.094 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.177 182729 DEBUG nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.177 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.177 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.177 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-unplugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG oslo_concurrency.lockutils [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.178 182729 DEBUG nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] No waiting events found dispatching network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.179 182729 WARNING nova.compute.manager [req-c224308b-4bb3-40fe-8164-88f5d41b1eec req-628a007a-cb9d-4540-be25-e9192fdad4d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received unexpected event network-vif-plugged-9a6dc28c-828d-435c-b619-7c51693137c4 for instance with vm_state active and task_state deleting.
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.369 182729 DEBUG nova.network.neutron [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.429 182729 INFO nova.compute.manager [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Took 1.94 seconds to deallocate network for instance.
Jan 22 22:39:58 compute-0 podman[230523]: 2026-01-22 22:39:58.489153029 +0000 UTC m=+0.048622655 container create 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:39:58 compute-0 systemd[1]: Started libpod-conmon-921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522.scope.
Jan 22 22:39:58 compute-0 podman[230523]: 2026-01-22 22:39:58.464988735 +0000 UTC m=+0.024458361 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.563 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.563 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:58 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:39:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed027b97421cbf4f3d77bf6cd787e307f54f2cb18cbeb35a5952cf05c4a76a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:39:58 compute-0 podman[230523]: 2026-01-22 22:39:58.607893233 +0000 UTC m=+0.167362889 container init 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 22:39:58 compute-0 podman[230523]: 2026-01-22 22:39:58.613710038 +0000 UTC m=+0.173179674 container start 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:39:58 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [NOTICE]   (230542) : New worker (230544) forked
Jan 22 22:39:58 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [NOTICE]   (230542) : Loading success.
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.676 182729 DEBUG nova.network.neutron [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Updated VIF entry in instance network info cache for port 427a76c6-9759-412b-9f78-6d0e033fa0c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.676 182729 DEBUG nova.network.neutron [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Updating instance_info_cache with network_info: [{"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.694 182729 DEBUG oslo_concurrency.lockutils [req-83ec419b-89b2-4605-97f2-ea05212a2452 req-057b3597-0317-426a-85c2-0c69b875f63f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9273e299-fee4-42e3-a2d9-f8b355cc5cfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.756 182729 DEBUG nova.compute.provider_tree [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.775 182729 DEBUG nova.scheduler.client.report [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.798 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.834 182729 INFO nova.scheduler.client.report [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f
Jan 22 22:39:58 compute-0 nova_compute[182725]: 2026-01-22 22:39:58.934 182729 DEBUG oslo_concurrency.lockutils [None req-9c75e20f-0bb6-45a5-8fe9-8dd3f0940b89 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.008 182729 DEBUG nova.network.neutron [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updated VIF entry in instance network info cache for port 9a6dc28c-828d-435c-b619-7c51693137c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.009 182729 DEBUG nova.network.neutron [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Updating instance_info_cache with network_info: [{"id": "9a6dc28c-828d-435c-b619-7c51693137c4", "address": "fa:16:3e:2d:f6:e8", "network": {"id": "7a53a2bd-56e7-4238-a2c4-69b2eaef9fb9", "bridge": "br-int", "label": "tempest-network-smoke--1816498635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a6dc28c-82", "ovs_interfaceid": "9a6dc28c-828d-435c-b619-7c51693137c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.027 182729 DEBUG oslo_concurrency.lockutils [req-a886e92c-2c90-43f7-a145-ba0f657c2d31 req-29b59dff-9545-4547-8249-c93c6dc10d1b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.187 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.367 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.368 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.368 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.368 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.368 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.369 182729 WARNING nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 for instance with vm_state active and task_state None.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.369 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Received event network-vif-deleted-9a6dc28c-828d-435c-b619-7c51693137c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.369 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.369 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.369 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.370 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.370 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Processing event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.370 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.371 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.371 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.372 182729 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.372 182729 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] No waiting events found dispatching network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.373 182729 WARNING nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received unexpected event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 for instance with vm_state building and task_state spawning.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.374 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.383 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121599.3836334, 9273e299-fee4-42e3-a2d9-f8b355cc5cfe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.385 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] VM Resumed (Lifecycle Event)
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.388 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.393 182729 INFO nova.virt.libvirt.driver [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Instance spawned successfully.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.393 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.428 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.430 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.430 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.431 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.431 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.432 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.433 182729 DEBUG nova.virt.libvirt.driver [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.438 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.474 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.521 182729 INFO nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Took 12.57 seconds to spawn the instance on the hypervisor.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.522 182729 DEBUG nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.722 182729 INFO nova.compute.manager [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Took 13.56 seconds to build instance.
Jan 22 22:39:59 compute-0 nova_compute[182725]: 2026-01-22 22:39:59.775 182729 DEBUG oslo_concurrency.lockutils [None req-7da58232-963a-4f6b-95e9-be72e3f56e28 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.303 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.931 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.933 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.933 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.934 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.934 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.949 182729 INFO nova.compute.manager [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Terminating instance
Jan 22 22:40:01 compute-0 nova_compute[182725]: 2026-01-22 22:40:01.964 182729 DEBUG nova.compute.manager [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:40:01 compute-0 kernel: tap427a76c6-97 (unregistering): left promiscuous mode
Jan 22 22:40:01 compute-0 NetworkManager[54954]: <info>  [1769121601.9991] device (tap427a76c6-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 ovn_controller[94850]: 2026-01-22T22:40:02Z|00519|binding|INFO|Releasing lport 427a76c6-9759-412b-9f78-6d0e033fa0c9 from this chassis (sb_readonly=0)
Jan 22 22:40:02 compute-0 ovn_controller[94850]: 2026-01-22T22:40:02Z|00520|binding|INFO|Setting lport 427a76c6-9759-412b-9f78-6d0e033fa0c9 down in Southbound
Jan 22 22:40:02 compute-0 ovn_controller[94850]: 2026-01-22T22:40:02Z|00521|binding|INFO|Removing iface tap427a76c6-97 ovn-installed in OVS
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.022 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:e4:66 10.100.0.10'], port_security=['fa:16:3e:58:e4:66 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9273e299-fee4-42e3-a2d9-f8b355cc5cfe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=427a76c6-9759-412b-9f78-6d0e033fa0c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.024 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 427a76c6-9759-412b-9f78-6d0e033fa0c9 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 unbound from our chassis
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.026 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.027 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[de1f694a-f9c3-49d9-a82b-24b1d7895af1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.031 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace which is not needed anymore
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.052 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000085.scope: Deactivated successfully.
Jan 22 22:40:02 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000085.scope: Consumed 3.365s CPU time.
Jan 22 22:40:02 compute-0 systemd-machined[154006]: Machine qemu-61-instance-00000085 terminated.
Jan 22 22:40:02 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [NOTICE]   (230542) : haproxy version is 2.8.14-c23fe91
Jan 22 22:40:02 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [NOTICE]   (230542) : path to executable is /usr/sbin/haproxy
Jan 22 22:40:02 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [WARNING]  (230542) : Exiting Master process...
Jan 22 22:40:02 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [ALERT]    (230542) : Current worker (230544) exited with code 143 (Terminated)
Jan 22 22:40:02 compute-0 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[230538]: [WARNING]  (230542) : All workers exited. Exiting... (0)
Jan 22 22:40:02 compute-0 systemd[1]: libpod-921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522.scope: Deactivated successfully.
Jan 22 22:40:02 compute-0 podman[230578]: 2026-01-22 22:40:02.229447321 +0000 UTC m=+0.070435879 container died 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.244 182729 INFO nova.virt.libvirt.driver [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Instance destroyed successfully.
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.245 182729 DEBUG nova.objects.instance [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'resources' on Instance uuid 9273e299-fee4-42e3-a2d9-f8b355cc5cfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.258 182729 DEBUG nova.virt.libvirt.vif [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-810446562',display_name='tempest-ServersNegativeTestJSON-server-810446562',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-810446562',id=133,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-tnoa5gq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:59Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=9273e299-fee4-42e3-a2d9-f8b355cc5cfe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.259 182729 DEBUG nova.network.os_vif_util [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "address": "fa:16:3e:58:e4:66", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap427a76c6-97", "ovs_interfaceid": "427a76c6-9759-412b-9f78-6d0e033fa0c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.260 182729 DEBUG nova.network.os_vif_util [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.260 182729 DEBUG os_vif [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.262 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.262 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap427a76c6-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:40:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522-userdata-shm.mount: Deactivated successfully.
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.269 182729 INFO os_vif [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:e4:66,bridge_name='br-int',has_traffic_filtering=True,id=427a76c6-9759-412b-9f78-6d0e033fa0c9,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap427a76c6-97')
Jan 22 22:40:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-2ed027b97421cbf4f3d77bf6cd787e307f54f2cb18cbeb35a5952cf05c4a76a2-merged.mount: Deactivated successfully.
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.270 182729 INFO nova.virt.libvirt.driver [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Deleting instance files /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe_del
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.271 182729 INFO nova.virt.libvirt.driver [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Deletion of /var/lib/nova/instances/9273e299-fee4-42e3-a2d9-f8b355cc5cfe_del complete
Jan 22 22:40:02 compute-0 podman[230578]: 2026-01-22 22:40:02.28627461 +0000 UTC m=+0.127263168 container cleanup 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:40:02 compute-0 systemd[1]: libpod-conmon-921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522.scope: Deactivated successfully.
Jan 22 22:40:02 compute-0 podman[230617]: 2026-01-22 22:40:02.322859433 +0000 UTC m=+0.077858815 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64)
Jan 22 22:40:02 compute-0 podman[230648]: 2026-01-22 22:40:02.35524932 +0000 UTC m=+0.046324436 container remove 921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:40:02 compute-0 podman[230608]: 2026-01-22 22:40:02.357509397 +0000 UTC m=+0.112500688 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.367 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[50281eae-bcac-4ded-9e6c-77f2ea55e4f8]: (4, ('Thu Jan 22 10:40:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522)\n921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522\nThu Jan 22 10:40:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522)\n921a6231cb6107f8ecb6d707dee8f58b187c3522fffdc527b5ca1e17522ef522\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.370 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b248f820-c336-4eca-b42b-0086c06ea44e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.371 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:02 compute-0 kernel: tap17ab2e5b-00: left promiscuous mode
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.379 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d6e5ba-1f64-4f9c-b368-65c8d21f71af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.384 182729 INFO nova.compute.manager [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.384 182729 DEBUG oslo.service.loopingcall [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.385 182729 DEBUG nova.compute.manager [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.385 182729 DEBUG nova.network.neutron [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.394 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.397 182729 DEBUG nova.compute.manager [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-unplugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.398 182729 DEBUG oslo_concurrency.lockutils [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.398 182729 DEBUG oslo_concurrency.lockutils [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.398 182729 DEBUG oslo_concurrency.lockutils [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.398 182729 DEBUG nova.compute.manager [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] No waiting events found dispatching network-vif-unplugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:02 compute-0 nova_compute[182725]: 2026-01-22 22:40:02.399 182729 DEBUG nova.compute.manager [req-38747f3d-c3e4-4e92-9869-68c4d5a8a2a7 req-0ea28ddd-0c84-467e-8247-ffef3d4a921c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-unplugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.399 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fa65db-b732-4cad-a97f-2661e216cec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.401 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccae912-ec6a-4f84-8c19-6d45982e7507]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.417 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1936f302-f849-4184-8d8d-feb2efce2c63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523539, 'reachable_time': 37205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230680, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.420 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:40:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:02.420 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7a0e19-a148-4907-93bf-80c6c29edeec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d17ab2e5b\x2d049b\x2d4984\x2da18a\x2d6b3e44614ef5.mount: Deactivated successfully.
Jan 22 22:40:03 compute-0 ovn_controller[94850]: 2026-01-22T22:40:03Z|00522|binding|INFO|Releasing lport cf3a3ad2-5afe-400f-b31e-2a0edf61e11b from this chassis (sb_readonly=0)
Jan 22 22:40:03 compute-0 ovn_controller[94850]: 2026-01-22T22:40:03Z|00523|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.109 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.534 182729 DEBUG nova.compute.manager [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.536 182729 DEBUG nova.compute.manager [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing instance network info cache due to event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.536 182729 DEBUG oslo_concurrency.lockutils [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.537 182729 DEBUG oslo_concurrency.lockutils [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:03 compute-0 nova_compute[182725]: 2026-01-22 22:40:03.538 182729 DEBUG nova.network.neutron [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.045 182729 DEBUG nova.network.neutron [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.071 182729 INFO nova.compute.manager [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Took 1.69 seconds to deallocate network for instance.
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.147 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.148 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.190 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.231 182729 DEBUG nova.compute.manager [req-561aad7d-3f57-4565-bd11-5a86888866c6 req-ec20911c-46a8-4b2b-b2d2-b988b33ef90a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-deleted-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.301 182729 DEBUG nova.compute.provider_tree [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.342 182729 DEBUG nova.scheduler.client.report [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.379 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.432 182729 INFO nova.scheduler.client.report [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Deleted allocations for instance 9273e299-fee4-42e3-a2d9-f8b355cc5cfe
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.584 182729 DEBUG nova.compute.manager [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.585 182729 DEBUG oslo_concurrency.lockutils [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.586 182729 DEBUG oslo_concurrency.lockutils [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.587 182729 DEBUG oslo_concurrency.lockutils [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.587 182729 DEBUG nova.compute.manager [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] No waiting events found dispatching network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.588 182729 WARNING nova.compute.manager [req-1efd3f29-e90f-4d34-bea4-0abada87c887 req-03401c39-9d93-447c-b547-9497c5c7cc8b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Received unexpected event network-vif-plugged-427a76c6-9759-412b-9f78-6d0e033fa0c9 for instance with vm_state deleted and task_state None.
Jan 22 22:40:04 compute-0 nova_compute[182725]: 2026-01-22 22:40:04.601 182729 DEBUG oslo_concurrency.lockutils [None req-d6322a0f-5ff3-4ce5-8616-fd21cbc0a944 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "9273e299-fee4-42e3-a2d9-f8b355cc5cfe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:05 compute-0 nova_compute[182725]: 2026-01-22 22:40:05.368 182729 DEBUG nova.network.neutron [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updated VIF entry in instance network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:40:05 compute-0 nova_compute[182725]: 2026-01-22 22:40:05.370 182729 DEBUG nova.network.neutron [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:05 compute-0 nova_compute[182725]: 2026-01-22 22:40:05.418 182729 DEBUG oslo_concurrency.lockutils [req-7fe3fb75-bf7f-4ec0-aaf7-49f47fbb9741 req-a206d026-23d3-494b-b93c-082a5615eb97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:07 compute-0 nova_compute[182725]: 2026-01-22 22:40:07.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.115 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000081', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'abdd987d004046138277253df8658aca', 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'hostId': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.121 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '254e913f-3968-436b-afcc-e51c2350b232', 'name': 'tempest-ServerActionsTestOtherB-server-424272542', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'abdd987d004046138277253df8658aca', 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'hostId': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.125 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'name': 'tempest-TestNetworkBasicOps-server-80512993', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000084', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffd58948cb444c25ae034a02c0344de7', 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'hostId': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.125 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.149 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/memory.usage volume: 46.515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.164 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/memory.usage volume: 41.90625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.187 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59ccc468-f3fa-4fb1-9c13-d774580e9ddb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.515625, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'timestamp': '2026-01-22T22:40:09.125935', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4e489362-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.806535919, 'message_signature': '906af007ff7e4ad59202a69ec0726aa6190c8c86d7e34cffe480d9aaabbd7e34'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.90625, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232', 'timestamp': '2026-01-22T22:40:09.125935', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4e4acbc8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.821654447, 'message_signature': 'e30b0239c91a2983e3fdab3066ab88543c7ba77a5a12d9df631bd44f03ee1c51'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'timestamp': '2026-01-22T22:40:09.125935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4e4e5e3c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.844757404, 'message_signature': '98881f44ed3074709bce6033d6c36f532f9194fbe655ec501a857d14d9efa2bd'}]}, 'timestamp': '2026-01-22 22:40:09.188320', '_unique_id': '91a9109a8a8d4dd7aeea1c16eca8c9d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.191 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.191 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>]
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:40:09 compute-0 nova_compute[182725]: 2026-01-22 22:40:09.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.208 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e4a5cd94-28e4-4031-ae49-2527cbacc939 / tapd1f07ed0-8f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.208 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.212 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.215 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e6a1471a-80f0-43ff-95e0-b865b6134ab6 / tap22e0ead7-6f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.215 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97e79d39-a91a-4aac-82a0-f3818957b39e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.191528', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e519444-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '1998b60d1e92dd614b6ee4339ea354fb60877262c15e6b04dabf130b9f16fdfc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.191528', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e522832-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '2329a7f439d9cda250513ceae1866a42a1847c9541c3bcaca34d58588fa03b64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.191528', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e529bbe-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '1d12c451b98e78f03017731991da5ba09c7d7017d143fbb8c7403c77d520a2f1'}]}, 'timestamp': '2026-01-22 22:40:09.216110', '_unique_id': 'a522fcf9e4e4405a83075b10f03d33de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.217 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.259 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.latency volume: 2401772294 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.260 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.289 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.latency volume: 3245127195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.290 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.323 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.latency volume: 1553262449 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.324 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f2ee49-563f-4cd4-ac75-8f23a98cedcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2401772294, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e595e90-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '65c62f3ca2e90bf23893b2b1f16e5acd589323aa7433c6cbdd763ba04f6adf29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 
'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e597178-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': 'fbdab3a349c137a08ff6089a41b77da86eb08990970bc4a31d4d5753e8a37dd3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3245127195, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e5de140-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'f053658a90fc62850e01428d7fa3911320d756db78fdd185235f6f8ec784b2ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e5df770-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'fa93e15d027408deddc4c8a675ab27eda38d817da9c735d7aabab616d30239d0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1553262449, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 
'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e631e62-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '087fe3bfff7d22e0cda0eaf93c642526a8143d8fa550065151b5a5341c1e3a1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.218432', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e63321c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '32789b2dfa18dd6ceef7d0ddd885ea73f4ddfa58c2bcbe998e790606bea3e779'}]}, 'timestamp': '2026-01-22 22:40:09.324909', '_unique_id': '74cbfdedb8e447beb73fb91769e1eb2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.328 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.329 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.bytes.delta volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.329 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa85aba9-711f-4e4e-bdc3-ab36672aceee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.328513', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e63d76c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '23ab756c852ed1ca0d6e85f187ee39d7953ccad740f3bfda0baa46f0a8f86d93'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.328513', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e63ea22-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '57522eaf9c8832bf83b1f503531cbfe4c55b00dc74e928b6a820a7f88760aebc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.328513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e63fb70-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': 'df4cd7d7628debf25eb2624ea97e3cc24fec8d67fb6b8b31e271ee4daa11ba31'}]}, 'timestamp': '2026-01-22 22:40:09.330003', '_unique_id': 'dbbfbe0a41994ab9bcdc4b0cfc3c82d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.326 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.369 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.370 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.388 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.389 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.402 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.403 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a9113fc-0864-4d21-add2-864aeb0c53b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e6a2d10-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': '42711b0e8589f743f838c3846795f828efc5b301d057144daf35f1a2dc1ae71a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 
'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e6a4476-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': 'a766a04f55347f6b0ca9314c2579f97e7061eeee4fc6b1176f7b885221e337b0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e6cff18-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': '63a5b0cde89577bd4035367a499301abdd77c5b8e8a8830b127ba031cd522c9b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e6d1444-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': '1e62db66c4b5df2a526247c3b65527b123788223e6a9b2643f4ac53c0fbe5cac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 
'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e6f1c6c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': '2d12d5fee578171b8708853db11de2c6bd58832e64e6ecea0d5b1063d6ba6758'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.332717', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: ': '4e6f349a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': '2975ef810972adb8f7d28e850f8e2a7f7e5d8815aed5227c12f748fdca2d6136'}]}, 'timestamp': '2026-01-22 22:40:09.403556', '_unique_id': '0a89381626124879ba4ee486609db631'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.407 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.latency volume: 180165048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.407 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.latency volume: 35620888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.408 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.latency volume: 139589681 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.408 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.latency volume: 21109291 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.409 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.latency volume: 154754292 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.409 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.latency volume: 23641639 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8304f8ec-d3f3-40fb-bfb4-8bf75227d106', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 180165048, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e6fd224-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '6570b69476361a1cb1889c4fa4c69eec0b6743a64f1361d9c89d8212e71a418a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35620888, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': 
None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e6fe610-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '2f431918c46581797f0e7cf2c0206943d6be3642a50c20008879d1b58df12ad0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 139589681, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e6ff786-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '606bf02fa187cb27907393dc85b8c2c4e490f542e44c48d8f131530ef999c6f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21109291, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e700a00-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '49bcb7d34eb914dcfeff5c6a8800e31e9157011ad5983402bd36d897159c1092'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 154754292, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 
'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e701f36-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '8df0191fae37dab438628692e986307eff97c99dc00f62d5685608ad7c30ff02'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23641639, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.407064', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: ': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e702fee-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': 'fb6056a6d08da327b2e374b2ec800d7ee188ac212d7efc9fab4855a100023385'}]}, 'timestamp': '2026-01-22 22:40:09.410009', '_unique_id': 'fe3a6a7091d74adbbaf9cf200429a35e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.413 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.413 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>]
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.413 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.414 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.414 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.415 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.415 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.416 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1118fc37-c476-40e7-8748-386824d28a4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e70db1a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': 'dbac47d242af00a0a2d256c5c64813e07de23ececf4ad16991f9326a802c32f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 
'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e70ec72-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': '3bb1f7640319d13011c4908460f1fcb23af3519c399155f1dfb9ea4286ee7dd6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e70fe56-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': '6863053a296ec6edde7f792f668f77eb617c628d982f0adbee59e9bc3a290095'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e710e64-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': '0e5d2684f62c201fae6169891985928f206273970513837e6c097c8877a0577f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 
'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e712156-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': '15e156f24a823d98956ce62ca8a7fabff035a5450d4132f4b5b41d02cbfc8791'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.413853', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_na
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: me': 'sda'}, 'message_id': '4e7131f0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': 'cc4afc1ce2621e06731045d31b8b44f4123e52d3a25fbae817e2c09fc664f287'}]}, 'timestamp': '2026-01-22 22:40:09.416591', '_unique_id': 'a4d2af98a4944081b602b7e14da47921'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.419 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.requests volume: 1110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.419 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.420 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.requests volume: 1049 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.420 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.421 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.requests volume: 942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.421 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '428a54b6-9b20-4836-95b5-4cf6beec5ac7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1110, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e71b65c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': 'ebdaa253d74648822e3c35109d223ab16a996d9718aa83e56be7bb79336cf59f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': 
None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e71c9b2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '32f27cd6e2e874298928c4e9a808c9305e402ffc4ceb86906f09a831053f9e2a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1049, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e71da06-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '715988504cb59b943f014c0398aa0323f6be91b57db98b6d34a1e7d4cd1cb501'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e71edf2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'd267df20aabdd9c17e736434d44df4847b4aaf29f29533fdce9a047a878fa1ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 942, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 
'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e71febe-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': 'dba8ceef681ca205f49b49bbdce404d8680d7136e63250daea6c9b73f228e6e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.419420', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memor
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: y_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e72107a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '0c6dc8de85250a8bea197687280fc79316b99020f4265a84633190445c7e75d6'}]}, 'timestamp': '2026-01-22 22:40:09.422242', '_unique_id': '8978ab1995c84885af1cd4498cd337e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.424 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.bytes volume: 73097216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.425 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.425 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.426 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.426 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.427 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c00a631f-a168-4a5c-ae28-b2d15bb0fe2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73097216, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e728bea-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': 'ae07aa1558639395acca632974c8210621cb01bb2657df3e28be20e69da4904b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 
'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e729fea-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '4a32e2683489af29901c9eadc14150ad2bc49e2f0a726c0c02a1fa9eccc9c628'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e72b228-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '04b98f4211f110da95070cdacbf1cbb600f425c1615b064089c40472022fb9cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e72c222-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '88ae3146f34359ed6bbbf5ddbaccd696f02a1e8dbb1090e8767f647f68cae28f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 
'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e72d37a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '2f0aab012bf3c86fea1175b4cb179211585d5b00901fc323b7549e40d6a5b5aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.424933', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb'
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e72e36a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': 'c20b89ccd23f83a6c22de2e4b6cb70caa7e81509a8b74c1448b3268059394f7d'}]}, 'timestamp': '2026-01-22 22:40:09.427640', '_unique_id': '656b20a9489f4b3dbb1c65a2822d0e68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.430 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.bytes volume: 30755328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.430 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.431 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.bytes volume: 29161984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.431 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.432 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.bytes volume: 27249664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.432 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d6b61e-d013-4987-bab3-3e4c57dba2c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30755328, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e735ee4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '1a01d7ce68498d7cdbf71fcf3b513fa74805adf8023a8234c8b2fba83c54981f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 
'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e73714a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': 'e0f6d56761f271572492b7ea0de4db62c89a81388073a031df85cce31c4f4b5c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29161984, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e738158-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'b1260186958ff772a583ef58428a3ba9ea16458702dd70e8668fb5894b6349a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e73949a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'e41ae0e0f97445a4e00ed1050795616738251fa90cad167dbecc791a859c43ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27249664, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': 
None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e73a4c6-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '5097ec42d3b21cb58fadfd67a8037ce6b6807fa847bc9ecefa7e733df34da9ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.430349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephe
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: meral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e73b22c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '7d20131397daa44f5678f3c2a0f16e0b0de92ffc9d1cfd1637bf8290704d96ed'}]}, 'timestamp': '2026-01-22 22:40:09.432881', '_unique_id': '2541f3195f4b4e1f85c16e08b0b0a170'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.434 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.434 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.435 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.435 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91cab022-e06b-4738-b008-f6ec798ce264', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.434713', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e7409b6-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '617b1967d26dd8531b0157e83771be898ac296977a5abc19bc62a0c630055ab5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.434713', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e74150a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '4e91d0de16661a5548972784408258f6a0632e5c71e91a892a2d058224b20f13'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.434713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e741fbe-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '848748f72def798870e88cc1d18558fc0d9c17f0408a954e3097a1f44cbd791a'}]}, 'timestamp': '2026-01-22 22:40:09.435674', '_unique_id': '571d2556354843e2a5b829e441d26786'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.437 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.437 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>]
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.437 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.438 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.438 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0eeb48f-2fef-430e-bf31-df2fb65a44fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.437691', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e747acc-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '91a7963cd6080d9b117be0c5d05dc1ff762113a48cf66dcf8971e73ff0537288'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.437691', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e748602-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '42642e0b2f9e2a86ccc9f1edaa966500f8e582de909a475f64a727bad78039a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.437691', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e74908e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '0e5aee39cf8b1afe76624d75ef1db98e84c6ed1176e7491e123372ead4310675'}]}, 'timestamp': '2026-01-22 22:40:09.438560', '_unique_id': '52c313d1ede045478d0cd25c861ec18e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.440 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.440 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/cpu volume: 11430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.440 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/cpu volume: 11240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.440 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/cpu volume: 10860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c769d28-a688-471b-982d-aae247677c0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11430000000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'timestamp': '2026-01-22T22:40:09.440112', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4e74d86e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.806535919, 'message_signature': '8b67ae49a3728ef4e9127cc453d8a568b4c40f2ca18d0f44e0b7caaea60b2ae2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11240000000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 
'254e913f-3968-436b-afcc-e51c2350b232', 'timestamp': '2026-01-22T22:40:09.440112', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4e74e2b4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.821654447, 'message_signature': '6859cde4b2a1c2333825ddc8d655214bba087073be6b326bcc3b6254ef1cfddb'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10860000000, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'timestamp': '2026-01-22T22:40:09.440112', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4e74ed90-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.844757404, 'message_signature': '09f96a4c90080b9e498c32e653772401667035a61fac4f34db2b14aab2ec921a'}]}, 'timestamp': '2026-01-22 22:40:09.440936', '_unique_id': 'c5845e36789d4eaa8d39e5630d2e1658'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.442 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.442 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.442 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.443 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.443 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.443 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.444 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cd1cb5d-061b-4f8d-ab60-111ebff0d222', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e75385e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': '185d3fe5da20c7eb6611267b9fc26d6927bd5a6ab7dda878e986e300dab066e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 
'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e7543da-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.990086091, 'message_signature': '7a14371061ae8aa40ef10bb4a9d024f01ce74e0d0623352b322154b25e57d20a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e754fc4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': 'a6d3b28a9ef4406a9022b9418ca1ee52b30fa3f673a4a0390be5d63d3cf8e519'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e755abe-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.028463449, 'message_signature': 'ded6d653a325e3ab29d4314174b3e682457ad6a6653ad514cfb00ffec4386283'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 
'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e7567a2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': '7eae9ef281b4372791eb37b294d02eb50d1c45ded25534deb3e9ee9afad4820e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.442566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'd
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: isk_name': 'sda'}, 'message_id': '4e75722e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5247.046853688, 'message_signature': '88c4230a84040e86e1d6826576a88e65c334d2d07560801cb2c9edf71155ec63'}]}, 'timestamp': '2026-01-22 22:40:09.444345', '_unique_id': 'c9ed6422a29b491ea05792502a5dd390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.445 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.446 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.446 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6128c582-df7f-4d5b-83c2-96c3ca2a6d66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.445965', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e75bd74-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '7f66cdff80dd2b556a54650cd23a9ae287cdc64ff8877c058f3861e5a05f721c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.445965', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e75c850-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '5989cc02694de48150671c494af86b747a1757659fd709250ad583f2bcb84475'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.445965', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e75d296-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': 'e4b83a11c1d0435e45d39cf08e05ca288625ed09dbce0765c95e5f458a21018c'}]}, 'timestamp': '2026-01-22 22:40:09.446831', '_unique_id': 'e81041d76a394b7cb8114042510ad492'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.448 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.incoming.bytes volume: 4115 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.448 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.448 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.incoming.bytes volume: 728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c191aab5-4eba-491e-932f-e1ea13e9b483', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4115, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.448362', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e761ae4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '99cfcae7fe3b96ca69978512cc521e2bd34865c6ebd2ad4c875f8a272f0bb35c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.448362', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e762688-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '0b6811a5b0807622133a0a84939f742796168240309cec60d5281be88ebce638'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 728, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.448362', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e76320e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '5ceb34bb4c49010fd32ea05d57823c156451f022a7d19ff86edd162e0e426ea1'}]}, 'timestamp': '2026-01-22 22:40:09.449265', '_unique_id': '135234d2d3974321af01acbb24f68292'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.450 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.451 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.bytes.delta volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.451 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5da06e8-a34a-49e5-b992-b40c6588313e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.450811', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e767b38-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': 'ef04ce99e36894a75e6a17f3bc05cc3b42c33f8128a57a124bd25b5caf30d1f1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.450811', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e76888a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '5a9cf03731387e071a279db1b45d1a948a5a7ff1bda6d27fb8e41b007aab3c58'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.450811', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e769352-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '8e3f6ef47fb1df88c7216ba5745f41acb6f5d0863c942680c2556b61bca177c6'}]}, 'timestamp': '2026-01-22 22:40:09.451858', '_unique_id': '7fd7485da12b41c1b6178ae4ec514727'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.453 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.453 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.454 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.outgoing.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e066cb7a-3f0e-4c4a-b9d2-6cd6736738ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.453584', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e76e6cc-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '164b2e9a37f9f8f0aefabbec3302880062583ccd0c930860a76a74e51277c44c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.453584', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e76f306-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '96f4a11cff2250d4def773d6c0b7e46a6e119065b9c4d8d266bbc8cac57235c2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.453584', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e76fd92-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '91c7d7211e715494a5570a7c3a3337dbaa2404ec81761dbc6386675baf2e32be'}]}, 'timestamp': '2026-01-22 22:40:09.454465', '_unique_id': '7bfd3395bc27417098364df7291cf29b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.455 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.456 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.456 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1471061737>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-80512993>]
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.456 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.456 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.457 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59dd721c-a004-47c9-9999-0bb410f9ecc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.456420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e7755b2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': '033a3123648aabc6afa22d4cde1a901bcaceb9104b7cdd4e7522e09dddcc452d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.456420', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e77684a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': '483ce5108a7ec7371fa9370a81aa71b556d5bd9e31a287cf0225af46afa2aa48'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.456420', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e7772d6-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '0a46382e0c1fc8a983b7c4082d73c17ecb5a5e2e7f093ef68ac726ecb72d6669'}]}, 'timestamp': '2026-01-22 22:40:09.457476', '_unique_id': '697e47cd5aea4936b471ab693c689e89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.459 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.459 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.459 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.459 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.requests volume: 327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.460 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.460 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.requests volume: 223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.460 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b239a2c-3fa6-400a-94bc-4fb934a31157', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-vda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e77c09c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '82701b293e2ec394774fcf7999781a96c44f9bc3b7954b6a930f5b0df6c6fc8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': 
None, 'resource_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939-sda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'instance-00000081', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e77cba0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.875704016, 'message_signature': '9bf89a0458f69e7f8ac02b0bc576f45738b2c05b115f86ed30882b592d10b826'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 327, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-vda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e77d8f2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': '41533c2b7c1085bed7b48d6113b9f7326686a9602456eddc63eae8a4fb077bdb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '254e913f-3968-436b-afcc-e51c2350b232-sda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'instance-0000007b', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e77e2f2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.918186527, 'message_signature': 'b6e7155c13820b048cf74c99f909c84ba16d6b7711a4315b450f74569f2bdf6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 223, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 
'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-vda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4e77ed60-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': '1f0b4fc915119da7e67d432efe8bbba6bfc6c948ecd2a2e341ee3294a65b5189'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6-sda', 'timestamp': '2026-01-22T22:40:09.459156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'instance-00000084', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: _mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4e77f74c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.947831626, 'message_signature': 'dc7dda58426bc47fbecd1cc3312500f557b98fd38464dfd544328d63481e6062'}]}, 'timestamp': '2026-01-22 22:40:09.460920', '_unique_id': 'c4b590f923b84cf38ed2477998f4acb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.462 12 DEBUG ceilometer.compute.pollsters [-] e4a5cd94-28e4-4031-ae49-2527cbacc939/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.463 12 DEBUG ceilometer.compute.pollsters [-] 254e913f-3968-436b-afcc-e51c2350b232/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.463 12 DEBUG ceilometer.compute.pollsters [-] e6a1471a-80f0-43ff-95e0-b865b6134ab6/network.outgoing.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51d6870e-4bae-4748-9b4b-04fe3516e71b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000081-e4a5cd94-28e4-4031-ae49-2527cbacc939-tapd1f07ed0-8f', 'timestamp': '2026-01-22T22:40:09.462833', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1471061737', 'name': 'tapd1f07ed0-8f', 'instance_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:50:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd1f07ed0-8f'}, 'message_id': '4e785098-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.848822035, 'message_signature': 'ab0fc8792cdc73a6aab19061eb0ffd66725ed5310555e0423af5e9a3c5c49bca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-0000007b-254e913f-3968-436b-afcc-e51c2350b232-tap354f33c9-4c', 'timestamp': '2026-01-22T22:40:09.462833', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-424272542', 'name': 'tap354f33c9-4c', 'instance_id': '254e913f-3968-436b-afcc-e51c2350b232', 'instance_type': 'm1.nano', 'host': '7664df65f86778e502733a7277d64519cbe11a1b06fcb5fb3254321f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:e4:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap354f33c9-4c'}, 'message_id': '4e785d7c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.86663543, 'message_signature': 'ae27cfc8d1c2657117d277f9d2024acbc360e646d773dda60eed93ef12a8cb74'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000084-e6a1471a-80f0-43ff-95e0-b865b6134ab6-tap22e0ead7-6f', 'timestamp': '2026-01-22T22:40:09.462833', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-80512993', 'name': 'tap22e0ead7-6f', 'instance_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'instance_type': 'm1.nano', 'host': 
'39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:c2:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap22e0ead7-6f'}, 'message_id': '4e786862-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5246.870337062, 'message_signature': '496f7cfd85f427ab7c4b82026bf8a8644a2557a51dcb0c5e4f4fb81f11131ee8'}]}, 'timestamp': '2026-01-22 22:40:09.463836', '_unique_id': 'c59a8a44b19147c7b0b400a7bf7aada9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:40:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:40:09.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.405 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.411 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.417 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.423 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.428 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.433 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.445 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:09 compute-0 rsyslogd[1008]: message too long (8192) with configured size 8096, begin of message is: 2026-01-22 22:40:09.461 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 22 22:40:10 compute-0 ovn_controller[94850]: 2026-01-22T22:40:10Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:c2:50 10.100.0.10
Jan 22 22:40:10 compute-0 ovn_controller[94850]: 2026-01-22T22:40:10Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:c2:50 10.100.0.10
Jan 22 22:40:10 compute-0 nova_compute[182725]: 2026-01-22 22:40:10.275 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.258 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.259 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.259 182729 INFO nova.compute.manager [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Shelving
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.281 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121596.2798252, dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.282 182729 INFO nova.compute.manager [-] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] VM Stopped (Lifecycle Event)
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.308 182729 DEBUG nova.compute.manager [None req-4632d83d-8b3a-45df-9911-cd272b63b9c4 - - - - - -] [instance: dd983e80-ec5c-4a5a-b0fa-a7e3b66dfb3f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:11 compute-0 nova_compute[182725]: 2026-01-22 22:40:11.310 182729 DEBUG nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:40:12 compute-0 nova_compute[182725]: 2026-01-22 22:40:12.271 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:12.448 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:12.449 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:12.451 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:13 compute-0 podman[230706]: 2026-01-22 22:40:13.154581266 +0000 UTC m=+0.071501935 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:40:13 compute-0 podman[230704]: 2026-01-22 22:40:13.15431998 +0000 UTC m=+0.088212472 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:40:13 compute-0 podman[230705]: 2026-01-22 22:40:13.16752909 +0000 UTC m=+0.089753332 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:40:13 compute-0 kernel: tapd1f07ed0-8f (unregistering): left promiscuous mode
Jan 22 22:40:13 compute-0 NetworkManager[54954]: <info>  [1769121613.4779] device (tapd1f07ed0-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:40:13 compute-0 nova_compute[182725]: 2026-01-22 22:40:13.485 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:13 compute-0 ovn_controller[94850]: 2026-01-22T22:40:13Z|00524|binding|INFO|Releasing lport d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb from this chassis (sb_readonly=0)
Jan 22 22:40:13 compute-0 ovn_controller[94850]: 2026-01-22T22:40:13Z|00525|binding|INFO|Setting lport d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb down in Southbound
Jan 22 22:40:13 compute-0 ovn_controller[94850]: 2026-01-22T22:40:13Z|00526|binding|INFO|Removing iface tapd1f07ed0-8f ovn-installed in OVS
Jan 22 22:40:13 compute-0 nova_compute[182725]: 2026-01-22 22:40:13.488 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:13 compute-0 nova_compute[182725]: 2026-01-22 22:40:13.511 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:13 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000081.scope: Deactivated successfully.
Jan 22 22:40:13 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000081.scope: Consumed 14.646s CPU time.
Jan 22 22:40:13 compute-0 systemd-machined[154006]: Machine qemu-57-instance-00000081 terminated.
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.195 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.316 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:50:f1 10.100.0.10'], port_security=['fa:16:3e:19:50:f1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e4a5cd94-28e4-4031-ae49-2527cbacc939', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b9a45c4-3bd4-4f5f-b26b-5b1ab95bdd58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.318 104215 INFO neutron.agent.ovn.metadata.agent [-] Port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 unbound from our chassis
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.319 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.327 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance shutdown successfully after 3 seconds.
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.333 182729 INFO nova.virt.libvirt.driver [-] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance destroyed successfully.
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.333 182729 DEBUG nova.objects.instance [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'numa_topology' on Instance uuid e4a5cd94-28e4-4031-ae49-2527cbacc939 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.335 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1995e72d-fdd6-4315-b7a8-2426479972e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.364 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b605a3db-98af-4fcd-99ca-62502d506592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.367 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[db698ef2-75fa-40b7-942f-3fb274b452ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.393 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[57ca792d-3acf-4bad-9fcc-a091889515ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.412 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[08f93f54-8bb2-4bdc-8766-35a22f120425]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512189, 'reachable_time': 16169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230799, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.428 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c350d002-a559-41f6-b8fe-8caace662903]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap84d8b010-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512202, 'tstamp': 512202}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230800, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap84d8b010-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512205, 'tstamp': 512205}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230800, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.429 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.431 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.435 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.435 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84d8b010-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.436 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.436 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84d8b010-d0, col_values=(('external_ids', {'iface-id': '8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:14 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:14.436 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.710 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Beginning cold snapshot process
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.963 182729 DEBUG nova.privsep.utils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:40:14 compute-0 nova_compute[182725]: 2026-01-22 22:40:14.964 182729 DEBUG oslo_concurrency.processutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk /var/lib/nova/instances/snapshots/tmp_ksjomcm/78fef8f6fd9c4da9890dbd361f9b0ed3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.252 182729 DEBUG oslo_concurrency.processutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939/disk /var/lib/nova/instances/snapshots/tmp_ksjomcm/78fef8f6fd9c4da9890dbd361f9b0ed3" returned: 0 in 0.288s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.253 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Snapshot extracted, beginning image upload
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.672 182729 DEBUG nova.compute.manager [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-vif-unplugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.673 182729 DEBUG oslo_concurrency.lockutils [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.673 182729 DEBUG oslo_concurrency.lockutils [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.674 182729 DEBUG oslo_concurrency.lockutils [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.674 182729 DEBUG nova.compute.manager [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] No waiting events found dispatching network-vif-unplugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:15 compute-0 nova_compute[182725]: 2026-01-22 22:40:15.674 182729 WARNING nova.compute.manager [req-3f7746c0-0d62-433a-94d8-68107b320754 req-cf5fd449-0414-4171-8f33-6936e09e2fde 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received unexpected event network-vif-unplugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 22:40:16 compute-0 nova_compute[182725]: 2026-01-22 22:40:16.454 182729 INFO nova.compute.manager [None req-6356d769-d2b2-49e1-a8d4-6d5388601dbd b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Get console output
Jan 22 22:40:16 compute-0 nova_compute[182725]: 2026-01-22 22:40:16.460 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.241 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121602.2402596, 9273e299-fee4-42e3-a2d9-f8b355cc5cfe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.242 182729 INFO nova.compute.manager [-] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] VM Stopped (Lifecycle Event)
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.274 182729 DEBUG nova.compute.manager [None req-6c7141e3-f561-47f9-8057-d4a30e6549db - - - - - -] [instance: 9273e299-fee4-42e3-a2d9-f8b355cc5cfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.275 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.849 182729 DEBUG nova.compute.manager [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.850 182729 DEBUG oslo_concurrency.lockutils [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.850 182729 DEBUG oslo_concurrency.lockutils [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.850 182729 DEBUG oslo_concurrency.lockutils [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.850 182729 DEBUG nova.compute.manager [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] No waiting events found dispatching network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:17 compute-0 nova_compute[182725]: 2026-01-22 22:40:17.850 182729 WARNING nova.compute.manager [req-4a9efbb2-12a7-4e5d-9899-d1318a8860a4 req-746a1fde-6c31-44d0-b243-b4515112c228 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received unexpected event network-vif-plugged-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.360 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Snapshot image upload complete
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.361 182729 DEBUG nova.compute.manager [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.461 182729 INFO nova.compute.manager [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Shelve offloading
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.481 182729 INFO nova.virt.libvirt.driver [-] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance destroyed successfully.
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.481 182729 DEBUG nova.compute.manager [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.483 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.483 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.483 182729 DEBUG nova.network.neutron [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:40:18 compute-0 nova_compute[182725]: 2026-01-22 22:40:18.521 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:19 compute-0 nova_compute[182725]: 2026-01-22 22:40:19.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:19 compute-0 nova_compute[182725]: 2026-01-22 22:40:19.935 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:20 compute-0 nova_compute[182725]: 2026-01-22 22:40:20.343 182729 DEBUG nova.network.neutron [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updating instance_info_cache with network_info: [{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:20 compute-0 nova_compute[182725]: 2026-01-22 22:40:20.360 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:22 compute-0 nova_compute[182725]: 2026-01-22 22:40:22.278 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:22 compute-0 nova_compute[182725]: 2026-01-22 22:40:22.483 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:22 compute-0 nova_compute[182725]: 2026-01-22 22:40:22.484 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:22 compute-0 nova_compute[182725]: 2026-01-22 22:40:22.484 182729 DEBUG nova.objects.instance [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'flavor' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.699 182729 DEBUG nova.objects.instance [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_requests' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.719 182729 DEBUG nova.network.neutron [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.818 182729 INFO nova.virt.libvirt.driver [-] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Instance destroyed successfully.
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.819 182729 DEBUG nova.objects.instance [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'resources' on Instance uuid e4a5cd94-28e4-4031-ae49-2527cbacc939 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.836 182729 DEBUG nova.virt.libvirt.vif [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1471061737',display_name='tempest-ServerActionsTestOtherB-server-1471061737',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1471061737',id=129,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-m8y1mmcu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member',shelved_at='2026-01-22T22:40:18.361188',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='b25e9103-0160-45e7-9887-7053028e2f2d'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:40:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=e4a5cd94-28e4-4031-ae49-2527cbacc939,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.837 182729 DEBUG nova.network.os_vif_util [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.838 182729 DEBUG nova.network.os_vif_util [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.839 182729 DEBUG os_vif [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.841 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.842 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f07ed0-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.844 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.845 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.849 182729 INFO os_vif [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:50:f1,bridge_name='br-int',has_traffic_filtering=True,id=d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1f07ed0-8f')
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.850 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Deleting instance files /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939_del
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.865 182729 INFO nova.virt.libvirt.driver [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Deletion of /var/lib/nova/instances/e4a5cd94-28e4-4031-ae49-2527cbacc939_del complete
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.919 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.920 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.924 182729 DEBUG nova.compute.manager [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Received event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.925 182729 DEBUG nova.compute.manager [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing instance network info cache due to event network-changed-d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.925 182729 DEBUG oslo_concurrency.lockutils [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.926 182729 DEBUG oslo_concurrency.lockutils [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:23 compute-0 nova_compute[182725]: 2026-01-22 22:40:23.926 182729 DEBUG nova.network.neutron [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Refreshing network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.011 182729 INFO nova.scheduler.client.report [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Deleted allocations for instance e4a5cd94-28e4-4031-ae49-2527cbacc939
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.022 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Error from libvirt while getting description of instance-00000081: [Error Code 42] Domain not found: no domain with matching uuid 'e4a5cd94-28e4-4031-ae49-2527cbacc939' (instance-00000081): libvirt.libvirtError: Domain not found: no domain with matching uuid 'e4a5cd94-28e4-4031-ae49-2527cbacc939' (instance-00000081)
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.026 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.086 182729 DEBUG nova.policy [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.102 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.103 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.124 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.125 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.160 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.169 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.232 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.233 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.255 182729 DEBUG nova.compute.provider_tree [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.273 182729 DEBUG nova.scheduler.client.report [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.291 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.305 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.402 182729 DEBUG oslo_concurrency.lockutils [None req-c52919ef-93c3-43da-a2ba-253d0c515ee1 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "e4a5cd94-28e4-4031-ae49-2527cbacc939" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 13.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.496 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.498 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5291MB free_disk=73.27618026733398GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.499 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.499 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.611 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 254e913f-3968-436b-afcc-e51c2350b232 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.611 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.611 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.612 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.736 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.748 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.771 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:40:24 compute-0 nova_compute[182725]: 2026-01-22 22:40:24.771 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:25 compute-0 nova_compute[182725]: 2026-01-22 22:40:25.141 182729 DEBUG nova.network.neutron [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Successfully created port: 658b3afc-9804-4041-afa0-856ac448b68e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:40:25 compute-0 nova_compute[182725]: 2026-01-22 22:40:25.771 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:25 compute-0 nova_compute[182725]: 2026-01-22 22:40:25.772 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:40:25 compute-0 nova_compute[182725]: 2026-01-22 22:40:25.772 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.150 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.151 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.151 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.151 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 254e913f-3968-436b-afcc-e51c2350b232 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.250 182729 DEBUG nova.network.neutron [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updated VIF entry in instance network info cache for port d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.251 182729 DEBUG nova.network.neutron [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Updating instance_info_cache with network_info: [{"id": "d1f07ed0-8f5e-407c-9e9b-74ba338aa8eb", "address": "fa:16:3e:19:50:f1", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": null, "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapd1f07ed0-8f", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.302 182729 DEBUG oslo_concurrency.lockutils [req-048cdecd-94ab-45e2-b455-2f73b1341332 req-8b0a702e-af17-4cdd-bb68-0ce71e789f18 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e4a5cd94-28e4-4031-ae49-2527cbacc939" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.570 182729 DEBUG nova.network.neutron [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Successfully updated port: 658b3afc-9804-4041-afa0-856ac448b68e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.588 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.589 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:26 compute-0 nova_compute[182725]: 2026-01-22 22:40:26.589 182729 DEBUG nova.network.neutron [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:40:27 compute-0 nova_compute[182725]: 2026-01-22 22:40:27.518 182729 DEBUG nova.compute.manager [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-changed-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:27 compute-0 nova_compute[182725]: 2026-01-22 22:40:27.518 182729 DEBUG nova.compute.manager [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing instance network info cache due to event network-changed-658b3afc-9804-4041-afa0-856ac448b68e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:40:27 compute-0 nova_compute[182725]: 2026-01-22 22:40:27.519 182729 DEBUG oslo_concurrency.lockutils [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:28 compute-0 podman[230825]: 2026-01-22 22:40:28.139757202 +0000 UTC m=+0.073235390 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:40:28 compute-0 nova_compute[182725]: 2026-01-22 22:40:28.763 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121613.7621324, e4a5cd94-28e4-4031-ae49-2527cbacc939 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:40:28 compute-0 nova_compute[182725]: 2026-01-22 22:40:28.764 182729 INFO nova.compute.manager [-] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] VM Stopped (Lifecycle Event)
Jan 22 22:40:28 compute-0 nova_compute[182725]: 2026-01-22 22:40:28.808 182729 DEBUG nova.compute.manager [None req-04caee98-5124-4f48-a35d-40e31484da51 - - - - - -] [instance: e4a5cd94-28e4-4031-ae49-2527cbacc939] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:28 compute-0 nova_compute[182725]: 2026-01-22 22:40:28.845 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.202 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.406 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updating instance_info_cache with network_info: [{"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.446 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-254e913f-3968-436b-afcc-e51c2350b232" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.447 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.448 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.448 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:29 compute-0 nova_compute[182725]: 2026-01-22 22:40:29.449 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.384 182729 DEBUG nova.network.neutron [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.420 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.421 182729 DEBUG oslo_concurrency.lockutils [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.421 182729 DEBUG nova.network.neutron [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing network info cache for port 658b3afc-9804-4041-afa0-856ac448b68e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.425 182729 DEBUG nova.virt.libvirt.vif [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.425 182729 DEBUG nova.network.os_vif_util [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.427 182729 DEBUG nova.network.os_vif_util [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.427 182729 DEBUG os_vif [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.428 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.429 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.429 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.432 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.433 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap658b3afc-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.433 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap658b3afc-98, col_values=(('external_ids', {'iface-id': '658b3afc-9804-4041-afa0-856ac448b68e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:5d:3c', 'vm-uuid': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.4365] manager: (tap658b3afc-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.449 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.451 182729 INFO os_vif [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98')
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.453 182729 DEBUG nova.virt.libvirt.vif [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.454 182729 DEBUG nova.network.os_vif_util [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.455 182729 DEBUG nova.network.os_vif_util [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.458 182729 DEBUG nova.virt.libvirt.guest [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] attach device xml: <interface type="ethernet">
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:97:5d:3c"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <target dev="tap658b3afc-98"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]: </interface>
Jan 22 22:40:31 compute-0 nova_compute[182725]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 22:40:31 compute-0 kernel: tap658b3afc-98: entered promiscuous mode
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.4757] manager: (tap658b3afc-98): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.475 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_controller[94850]: 2026-01-22T22:40:31Z|00527|binding|INFO|Claiming lport 658b3afc-9804-4041-afa0-856ac448b68e for this chassis.
Jan 22 22:40:31 compute-0 ovn_controller[94850]: 2026-01-22T22:40:31Z|00528|binding|INFO|658b3afc-9804-4041-afa0-856ac448b68e: Claiming fa:16:3e:97:5d:3c 10.100.0.28
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.479 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.507 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:5d:3c 10.100.0.28'], port_security=['fa:16:3e:97:5d:3c 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95543040-16cd-4dd4-8033-7cc6ece0df9f, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=658b3afc-9804-4041-afa0-856ac448b68e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.510 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 658b3afc-9804-4041-afa0-856ac448b68e in datapath e2e4afc8-807c-4b60-859b-b08af1bb8476 bound to our chassis
Jan 22 22:40:31 compute-0 systemd-udevd[230850]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.514 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e2e4afc8-807c-4b60-859b-b08af1bb8476
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.5282] device (tap658b3afc-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.5293] device (tap658b3afc-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.531 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9e713667-2881-4458-aca3-96ca288aed26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.534 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape2e4afc8-81 in ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.537 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape2e4afc8-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.537 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[68433ad7-d528-4efc-a61b-d8041a49c509]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.539 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e48e1223-3d98-4cde-b949-77e52b9eaaf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.542 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_controller[94850]: 2026-01-22T22:40:31Z|00529|binding|INFO|Setting lport 658b3afc-9804-4041-afa0-856ac448b68e ovn-installed in OVS
Jan 22 22:40:31 compute-0 ovn_controller[94850]: 2026-01-22T22:40:31Z|00530|binding|INFO|Setting lport 658b3afc-9804-4041-afa0-856ac448b68e up in Southbound
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.546 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.559 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[636ad22a-a720-4d06-950f-417cc4c3c298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.594 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6fca14e6-e3d4-4d0a-b279-ce5bdbf0e762]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.625 182729 DEBUG nova.virt.libvirt.driver [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.626 182729 DEBUG nova.virt.libvirt.driver [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.627 182729 DEBUG nova.virt.libvirt.driver [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:52:c2:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.627 182729 DEBUG nova.virt.libvirt.driver [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:97:5d:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.633 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[098f55e1-e680-4dd0-af3e-038d5eacbd4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.6435] manager: (tape2e4afc8-80): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.642 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[534def9b-12c2-4ec4-b163-c540719ddf84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 systemd-udevd[230854]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.656 182729 DEBUG nova.virt.libvirt.guest [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:31</nova:creationTime>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:31 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     <nova:port uuid="658b3afc-9804-4041-afa0-856ac448b68e">
Jan 22 22:40:31 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 22 22:40:31 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:31 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:31 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:31 compute-0 nova_compute[182725]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.685 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ab364346-e2cf-4a34-9423-b20b6287321c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.689 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[96e115a4-0c62-45f8-a682-6fc478bb88b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.692 182729 DEBUG oslo_concurrency.lockutils [None req-dcea110e-f1f3-4343-bc38-fc842b2ce2d2 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.7132] device (tape2e4afc8-80): carrier: link connected
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.720 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4a726d1d-3f5f-41d4-acea-281cc4c73b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.740 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9adff6d4-528b-4077-8d7b-876732d9e310]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape2e4afc8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:0c:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526931, 'reachable_time': 28438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230878, 'error': None, 'target': 'ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.755 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[54df2909-3eff-4189-99d7-968dac91827d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:ca3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526931, 'tstamp': 526931}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230879, 'error': None, 'target': 'ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.773 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[46fe1b96-3da8-4e80-94b3-f1d0d961eb83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape2e4afc8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:0c:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526931, 'reachable_time': 28438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230880, 'error': None, 'target': 'ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.805 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0c0561-ac4b-45f1-85d7-f70ea3343b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.875 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b699c327-abdb-414f-adb5-b5174df5e389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.877 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2e4afc8-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.877 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.878 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape2e4afc8-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 kernel: tape2e4afc8-80: entered promiscuous mode
Jan 22 22:40:31 compute-0 NetworkManager[54954]: <info>  [1769121631.8807] manager: (tape2e4afc8-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.881 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.886 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape2e4afc8-80, col_values=(('external_ids', {'iface-id': '3d98d97b-c997-4a03-9928-a5c26c7a3fb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.887 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_controller[94850]: 2026-01-22T22:40:31Z|00531|binding|INFO|Releasing lport 3d98d97b-c997-4a03-9928-a5c26c7a3fb8 from this chassis (sb_readonly=0)
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.889 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.889 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e2e4afc8-807c-4b60-859b-b08af1bb8476.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e2e4afc8-807c-4b60-859b-b08af1bb8476.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.890 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6000992c-a38f-4781-a470-c128d69fe0db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.891 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e2e4afc8-807c-4b60-859b-b08af1bb8476
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e2e4afc8-807c-4b60-859b-b08af1bb8476.pid.haproxy
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e2e4afc8-807c-4b60-859b-b08af1bb8476
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:40:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:31.892 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'env', 'PROCESS_TAG=haproxy-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e2e4afc8-807c-4b60-859b-b08af1bb8476.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.901 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.976 182729 DEBUG nova.compute.manager [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.976 182729 DEBUG oslo_concurrency.lockutils [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.977 182729 DEBUG oslo_concurrency.lockutils [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.977 182729 DEBUG oslo_concurrency.lockutils [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.977 182729 DEBUG nova.compute.manager [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:31 compute-0 nova_compute[182725]: 2026-01-22 22:40:31.977 182729 WARNING nova.compute.manager [req-0da56332-70b8-4c66-8668-41a5bb82c2ea req-4ad4b7ef-c552-4814-8716-4e400db6be82 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e for instance with vm_state active and task_state None.
Jan 22 22:40:32 compute-0 podman[230912]: 2026-01-22 22:40:32.290646112 +0000 UTC m=+0.058234835 container create 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:40:32 compute-0 systemd[1]: Started libpod-conmon-299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce.scope.
Jan 22 22:40:32 compute-0 podman[230912]: 2026-01-22 22:40:32.260947411 +0000 UTC m=+0.028536124 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:40:32 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:40:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/789dce5b7b2b699bbc94a24aff03355ca3bdc2c47d5015a00d2ff607cb6f340f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:40:32 compute-0 podman[230912]: 2026-01-22 22:40:32.413613193 +0000 UTC m=+0.181201896 container init 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:40:32 compute-0 podman[230912]: 2026-01-22 22:40:32.421015588 +0000 UTC m=+0.188604271 container start 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:40:32 compute-0 podman[230930]: 2026-01-22 22:40:32.452651578 +0000 UTC m=+0.081602409 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 22:40:32 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [NOTICE]   (230955) : New worker (230971) forked
Jan 22 22:40:32 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [NOTICE]   (230955) : Loading success.
Jan 22 22:40:32 compute-0 podman[230941]: 2026-01-22 22:40:32.513006725 +0000 UTC m=+0.099507126 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:40:32 compute-0 nova_compute[182725]: 2026-01-22 22:40:32.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:32 compute-0 nova_compute[182725]: 2026-01-22 22:40:32.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.769 182729 DEBUG nova.network.neutron [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updated VIF entry in instance network info cache for port 658b3afc-9804-4041-afa0-856ac448b68e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.770 182729 DEBUG nova.network.neutron [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.814 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-658b3afc-9804-4041-afa0-856ac448b68e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.815 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-658b3afc-9804-4041-afa0-856ac448b68e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.818 182729 DEBUG oslo_concurrency.lockutils [req-6b6eeea0-8abc-4e21-b06a-1aa09a3fdd1a req-c4916a92-c8af-471a-89a1-bb47b8477d3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.832 182729 DEBUG nova.objects.instance [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'flavor' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.870 182729 DEBUG nova.virt.libvirt.vif [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.871 182729 DEBUG nova.network.os_vif_util [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.873 182729 DEBUG nova.network.os_vif_util [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.877 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.880 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.882 182729 DEBUG nova.virt.libvirt.driver [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Attempting to detach device tap658b3afc-98 from instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.883 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] detach device xml: <interface type="ethernet">
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:97:5d:3c"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <target dev="tap658b3afc-98"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]: </interface>
Jan 22 22:40:33 compute-0 nova_compute[182725]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.892 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.895 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface>not found in domain: <domain type='kvm' id='60'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <name>instance-00000084</name>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <uuid>e6a1471a-80f0-43ff-95e0-b865b6134ab6</uuid>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:31</nova:creationTime>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <nova:port uuid="658b3afc-9804-4041-afa0-856ac448b68e">
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:33 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <memory unit='KiB'>131072</memory>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <vcpu placement='static'>1</vcpu>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <resource>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <partition>/machine</partition>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </resource>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <sysinfo type='smbios'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <system>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='manufacturer'>RDO</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='serial'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='uuid'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <entry name='family'>Virtual Machine</entry>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </system>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <os>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <boot dev='hd'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <smbios mode='sysinfo'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </os>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <features>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <vmcoreinfo state='on'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </features>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <model fallback='forbid'>Nehalem</model>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <feature policy='require' name='x2apic'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <feature policy='require' name='hypervisor'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <feature policy='require' name='vme'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <clock offset='utc'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <timer name='hpet' present='no'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <on_poweroff>destroy</on_poweroff>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <on_reboot>restart</on_reboot>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <on_crash>destroy</on_crash>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <disk type='file' device='disk'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk' index='2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <backingStore type='file' index='3'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:         <format type='raw'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:         <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:         <backingStore/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       </backingStore>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target dev='vda' bus='virtio'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='virtio-disk0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <disk type='file' device='cdrom'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config' index='1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <backingStore/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target dev='sda' bus='sata'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <readonly/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='sata0-0-0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pcie.0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='1' port='0x10'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='2' port='0x11'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='3' port='0x12'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.3'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='4' port='0x13'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.4'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='5' port='0x14'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.5'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='6' port='0x15'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.6'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='7' port='0x16'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.7'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='8' port='0x17'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.8'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='9' port='0x18'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.9'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='10' port='0x19'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.10'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='11' port='0x1a'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.11'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='12' port='0x1b'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.12'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='13' port='0x1c'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.13'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='14' port='0x1d'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.14'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='15' port='0x1e'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.15'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='16' port='0x1f'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.16'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='17' port='0x20'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.17'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='18' port='0x21'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.18'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='19' port='0x22'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.19'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='20' port='0x23'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.20'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='21' port='0x24'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.21'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='22' port='0x25'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.22'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='23' port='0x26'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.23'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='24' port='0x27'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.24'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target chassis='25' port='0x28'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.25'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model name='pcie-pci-bridge'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='pci.26'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='usb'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <controller type='sata' index='0'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='ide'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <interface type='ethernet'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <mac address='fa:16:3e:52:c2:50'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target dev='tap22e0ead7-6f'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model type='virtio'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <mtu size='1442'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='net0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <interface type='ethernet'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <mac address='fa:16:3e:97:5d:3c'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target dev='tap658b3afc-98'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model type='virtio'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <mtu size='1442'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='net1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <serial type='pty'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target type='isa-serial' port='0'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:         <model name='isa-serial'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       </target>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <target type='serial' port='0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </console>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <input type='tablet' bus='usb'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='input0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='usb' bus='0' port='1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <input type='mouse' bus='ps2'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='input1'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <input type='keyboard' bus='ps2'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='input2'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <listen type='address' address='::0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <audio id='1' type='none'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <video>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='video0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </video>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <watchdog model='itco' action='reset'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='watchdog0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </watchdog>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <memballoon model='virtio'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <stats period='10'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='balloon0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <rng model='virtio'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <backend model='random'>/dev/urandom</backend>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <alias name='rng0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <label>system_u:system_r:svirt_t:s0:c412,c871</label>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c412,c871</imagelabel>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <label>+107:+107</label>
Jan 22 22:40:33 compute-0 nova_compute[182725]:     <imagelabel>+107:+107</imagelabel>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:33 compute-0 nova_compute[182725]: </domain>
Jan 22 22:40:33 compute-0 nova_compute[182725]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.896 182729 INFO nova.virt.libvirt.driver [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully detached device tap658b3afc-98 from instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 from the persistent domain config.
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.898 182729 DEBUG nova.virt.libvirt.driver [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] (1/8): Attempting to detach device tap658b3afc-98 with device alias net1 from instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 22:40:33 compute-0 nova_compute[182725]: 2026-01-22 22:40:33.898 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] detach device xml: <interface type="ethernet">
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <mac address="fa:16:3e:97:5d:3c"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <model type="virtio"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <mtu size="1442"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]:   <target dev="tap658b3afc-98"/>
Jan 22 22:40:33 compute-0 nova_compute[182725]: </interface>
Jan 22 22:40:33 compute-0 nova_compute[182725]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 22:40:34 compute-0 kernel: tap658b3afc-98 (unregistering): left promiscuous mode
Jan 22 22:40:34 compute-0 NetworkManager[54954]: <info>  [1769121634.0242] device (tap658b3afc-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.034 182729 DEBUG nova.virt.libvirt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Received event <DeviceRemovedEvent: 1769121634.0343332, e6a1471a-80f0-43ff-95e0-b865b6134ab6 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.037 182729 DEBUG nova.virt.libvirt.driver [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Start waiting for the detach event from libvirt for device tap658b3afc-98 with device alias net1 for instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.038 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:34 compute-0 ovn_controller[94850]: 2026-01-22T22:40:34Z|00532|binding|INFO|Releasing lport 658b3afc-9804-4041-afa0-856ac448b68e from this chassis (sb_readonly=0)
Jan 22 22:40:34 compute-0 ovn_controller[94850]: 2026-01-22T22:40:34Z|00533|binding|INFO|Setting lport 658b3afc-9804-4041-afa0-856ac448b68e down in Southbound
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.041 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 ovn_controller[94850]: 2026-01-22T22:40:34Z|00534|binding|INFO|Removing iface tap658b3afc-98 ovn-installed in OVS
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.045 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface>not found in domain: <domain type='kvm' id='60'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <name>instance-00000084</name>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <uuid>e6a1471a-80f0-43ff-95e0-b865b6134ab6</uuid>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:31</nova:creationTime>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:port uuid="658b3afc-9804-4041-afa0-856ac448b68e">
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:34 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <memory unit='KiB'>131072</memory>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <vcpu placement='static'>1</vcpu>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <resource>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <partition>/machine</partition>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </resource>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <sysinfo type='smbios'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <system>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='manufacturer'>RDO</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='serial'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='uuid'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <entry name='family'>Virtual Machine</entry>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </system>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <os>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <boot dev='hd'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <smbios mode='sysinfo'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </os>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <features>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <vmcoreinfo state='on'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </features>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <model fallback='forbid'>Nehalem</model>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <feature policy='require' name='x2apic'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <feature policy='require' name='hypervisor'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <feature policy='require' name='vme'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <clock offset='utc'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <timer name='hpet' present='no'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <on_poweroff>destroy</on_poweroff>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <on_reboot>restart</on_reboot>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <on_crash>destroy</on_crash>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <disk type='file' device='disk'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk' index='2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <backingStore type='file' index='3'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:         <format type='raw'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:         <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:         <backingStore/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       </backingStore>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target dev='vda' bus='virtio'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='virtio-disk0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <disk type='file' device='cdrom'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config' index='1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <backingStore/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target dev='sda' bus='sata'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <readonly/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='sata0-0-0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pcie.0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='1' port='0x10'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='2' port='0x11'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='3' port='0x12'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.3'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='4' port='0x13'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.4'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='5' port='0x14'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.5'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='6' port='0x15'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.6'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='7' port='0x16'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.7'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='8' port='0x17'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.8'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='9' port='0x18'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.9'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='10' port='0x19'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.10'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='11' port='0x1a'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.11'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='12' port='0x1b'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.12'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='13' port='0x1c'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.13'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='14' port='0x1d'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.14'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='15' port='0x1e'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.15'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='16' port='0x1f'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.16'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='17' port='0x20'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.17'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='18' port='0x21'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.18'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='19' port='0x22'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.19'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='20' port='0x23'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.20'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='21' port='0x24'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.21'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='22' port='0x25'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.22'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='23' port='0x26'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.23'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='24' port='0x27'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.24'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target chassis='25' port='0x28'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.25'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model name='pcie-pci-bridge'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='pci.26'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='usb'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <controller type='sata' index='0'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='ide'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <interface type='ethernet'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <mac address='fa:16:3e:52:c2:50'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target dev='tap22e0ead7-6f'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model type='virtio'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <mtu size='1442'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='net0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <serial type='pty'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target type='isa-serial' port='0'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:         <model name='isa-serial'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       </target>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <target type='serial' port='0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </console>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <input type='tablet' bus='usb'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='input0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='usb' bus='0' port='1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <input type='mouse' bus='ps2'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='input1'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <input type='keyboard' bus='ps2'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='input2'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <listen type='address' address='::0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <audio id='1' type='none'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <video>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='video0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </video>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <watchdog model='itco' action='reset'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='watchdog0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </watchdog>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <memballoon model='virtio'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <stats period='10'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='balloon0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <rng model='virtio'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <backend model='random'>/dev/urandom</backend>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <alias name='rng0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <label>system_u:system_r:svirt_t:s0:c412,c871</label>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c412,c871</imagelabel>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <label>+107:+107</label>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <imagelabel>+107:+107</imagelabel>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:34 compute-0 nova_compute[182725]: </domain>
Jan 22 22:40:34 compute-0 nova_compute[182725]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.046 182729 INFO nova.virt.libvirt.driver [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully detached device tap658b3afc-98 from instance e6a1471a-80f0-43ff-95e0-b865b6134ab6 from the live domain config.
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.047 182729 DEBUG nova.virt.libvirt.vif [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.048 182729 DEBUG nova.network.os_vif_util [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.049 182729 DEBUG nova.network.os_vif_util [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.050 182729 DEBUG os_vif [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.052 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.051 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:5d:3c 10.100.0.28'], port_security=['fa:16:3e:97:5d:3c 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95543040-16cd-4dd4-8033-7cc6ece0df9f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=658b3afc-9804-4041-afa0-856ac448b68e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.053 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap658b3afc-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.054 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 658b3afc-9804-4041-afa0-856ac448b68e in datapath e2e4afc8-807c-4b60-859b-b08af1bb8476 unbound from our chassis
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.055 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.055 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.057 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.058 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e2e4afc8-807c-4b60-859b-b08af1bb8476, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.062 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5b477620-c42d-4014-826e-26c7524d0d1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.063 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476 namespace which is not needed anymore
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.080 182729 INFO os_vif [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98')
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.081 182729 DEBUG nova.virt.libvirt.guest [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:34</nova:creationTime>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:34 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:34 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:34 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:34 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:34 compute-0 nova_compute[182725]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.086 182729 DEBUG nova.compute.manager [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.087 182729 DEBUG oslo_concurrency.lockutils [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.087 182729 DEBUG oslo_concurrency.lockutils [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.088 182729 DEBUG oslo_concurrency.lockutils [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.088 182729 DEBUG nova.compute.manager [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.089 182729 WARNING nova.compute.manager [req-5a8000fa-2e5c-4554-b602-d180631208b9 req-abfb73c0-5ce4-4048-a067-a83f0d4d0d55 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e for instance with vm_state active and task_state None.
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.204 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [NOTICE]   (230955) : haproxy version is 2.8.14-c23fe91
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [NOTICE]   (230955) : path to executable is /usr/sbin/haproxy
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [WARNING]  (230955) : Exiting Master process...
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [WARNING]  (230955) : Exiting Master process...
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [ALERT]    (230955) : Current worker (230971) exited with code 143 (Terminated)
Jan 22 22:40:34 compute-0 neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476[230928]: [WARNING]  (230955) : All workers exited. Exiting... (0)
Jan 22 22:40:34 compute-0 systemd[1]: libpod-299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce.scope: Deactivated successfully.
Jan 22 22:40:34 compute-0 podman[231008]: 2026-01-22 22:40:34.231878161 +0000 UTC m=+0.043268471 container died 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:40:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce-userdata-shm.mount: Deactivated successfully.
Jan 22 22:40:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-789dce5b7b2b699bbc94a24aff03355ca3bdc2c47d5015a00d2ff607cb6f340f-merged.mount: Deactivated successfully.
Jan 22 22:40:34 compute-0 podman[231008]: 2026-01-22 22:40:34.271088891 +0000 UTC m=+0.082479191 container cleanup 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:40:34 compute-0 systemd[1]: libpod-conmon-299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce.scope: Deactivated successfully.
Jan 22 22:40:34 compute-0 podman[231036]: 2026-01-22 22:40:34.326814102 +0000 UTC m=+0.036075072 container remove 299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.332 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[33762d61-35d7-4a28-acb6-b00e4571e196]: (4, ('Thu Jan 22 10:40:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476 (299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce)\n299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce\nThu Jan 22 10:40:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476 (299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce)\n299c7fe3f46ad93d4d466a11c4a643b08792b2790e45268aea1f0884845f00ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.335 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f48fb45-2214-433d-b47e-19453ecb1989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.337 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2e4afc8-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.338 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 kernel: tape2e4afc8-80: left promiscuous mode
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.351 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.357 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb91598-f953-4390-91c0-7bf826fec158]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.377 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[53350fe5-d5d4-4240-b55a-2068aec3a441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.378 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6a3ee6-2227-46be-9118-9dde556dbea9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.397 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3bdea1-eb25-43d2-b7a9-4e5a177f4220]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526922, 'reachable_time': 37036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231051, 'error': None, 'target': 'ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.399 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e2e4afc8-807c-4b60-859b-b08af1bb8476 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.399 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2f4bba-7649-41dc-ac5f-af2c4c6285ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:34 compute-0 systemd[1]: run-netns-ovnmeta\x2de2e4afc8\x2d807c\x2d4b60\x2d859b\x2db08af1bb8476.mount: Deactivated successfully.
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.858 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.858 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:34.860 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.912 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.913 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:34 compute-0 nova_compute[182725]: 2026-01-22 22:40:34.913 182729 DEBUG nova.network.neutron [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.812 182729 DEBUG nova.compute.manager [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-deleted-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.813 182729 INFO nova.compute.manager [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Neutron deleted interface 658b3afc-9804-4041-afa0-856ac448b68e; detaching it from the instance and deleting it from the info cache
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.813 182729 DEBUG nova.network.neutron [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:35.863 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.867 182729 DEBUG nova.objects.instance [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lazy-loading 'system_metadata' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.900 182729 DEBUG nova.objects.instance [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lazy-loading 'flavor' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.921 182729 DEBUG nova.virt.libvirt.vif [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.922 182729 DEBUG nova.network.os_vif_util [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.923 182729 DEBUG nova.network.os_vif_util [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.926 182729 DEBUG nova.virt.libvirt.guest [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.931 182729 DEBUG nova.virt.libvirt.guest [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface>not found in domain: <domain type='kvm' id='60'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <name>instance-00000084</name>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <uuid>e6a1471a-80f0-43ff-95e0-b865b6134ab6</uuid>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:34</nova:creationTime>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <memory unit='KiB'>131072</memory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <vcpu placement='static'>1</vcpu>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <resource>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <partition>/machine</partition>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </resource>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <sysinfo type='smbios'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <system>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='manufacturer'>RDO</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='serial'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='uuid'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='family'>Virtual Machine</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </system>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <os>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <boot dev='hd'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <smbios mode='sysinfo'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </os>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <features>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <vmcoreinfo state='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </features>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <model fallback='forbid'>Nehalem</model>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='x2apic'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='hypervisor'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='vme'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <clock offset='utc'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='hpet' present='no'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_poweroff>destroy</on_poweroff>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_reboot>restart</on_reboot>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_crash>destroy</on_crash>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <disk type='file' device='disk'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk' index='2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backingStore type='file' index='3'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <format type='raw'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <backingStore/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       </backingStore>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='vda' bus='virtio'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='virtio-disk0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <disk type='file' device='cdrom'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config' index='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backingStore/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='sda' bus='sata'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <readonly/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='sata0-0-0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pcie.0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='1' port='0x10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='2' port='0x11'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='3' port='0x12'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='4' port='0x13'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='5' port='0x14'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='6' port='0x15'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='7' port='0x16'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='8' port='0x17'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.8'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='9' port='0x18'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.9'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='10' port='0x19'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='11' port='0x1a'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.11'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='12' port='0x1b'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.12'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='13' port='0x1c'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.13'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='14' port='0x1d'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.14'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='15' port='0x1e'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.15'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='16' port='0x1f'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.16'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='17' port='0x20'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.17'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='18' port='0x21'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.18'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='19' port='0x22'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.19'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='20' port='0x23'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.20'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='21' port='0x24'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.21'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='22' port='0x25'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.22'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='23' port='0x26'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.23'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='24' port='0x27'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.24'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='25' port='0x28'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.25'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-pci-bridge'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.26'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='usb'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='sata' index='0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='ide'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <interface type='ethernet'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <mac address='fa:16:3e:52:c2:50'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='tap22e0ead7-6f'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model type='virtio'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <mtu size='1442'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='net0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <serial type='pty'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target type='isa-serial' port='0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <model name='isa-serial'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       </target>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target type='serial' port='0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </console>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='tablet' bus='usb'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='usb' bus='0' port='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='mouse' bus='ps2'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='keyboard' bus='ps2'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <listen type='address' address='::0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <audio id='1' type='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <video>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='video0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </video>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <watchdog model='itco' action='reset'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='watchdog0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </watchdog>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <memballoon model='virtio'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <stats period='10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='balloon0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <rng model='virtio'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backend model='random'>/dev/urandom</backend>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='rng0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <label>system_u:system_r:svirt_t:s0:c412,c871</label>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c412,c871</imagelabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <label>+107:+107</label>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <imagelabel>+107:+107</imagelabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]: </domain>
Jan 22 22:40:35 compute-0 nova_compute[182725]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.932 182729 DEBUG nova.virt.libvirt.guest [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.936 182729 DEBUG nova.virt.libvirt.guest [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:97:5d:3c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap658b3afc-98"/></interface>not found in domain: <domain type='kvm' id='60'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <name>instance-00000084</name>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <uuid>e6a1471a-80f0-43ff-95e0-b865b6134ab6</uuid>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:34</nova:creationTime>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <memory unit='KiB'>131072</memory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <vcpu placement='static'>1</vcpu>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <resource>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <partition>/machine</partition>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </resource>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <sysinfo type='smbios'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <system>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='manufacturer'>RDO</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='product'>OpenStack Compute</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='serial'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='uuid'>e6a1471a-80f0-43ff-95e0-b865b6134ab6</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <entry name='family'>Virtual Machine</entry>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </system>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <os>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <boot dev='hd'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <smbios mode='sysinfo'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </os>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <features>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <vmcoreinfo state='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </features>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <cpu mode='custom' match='exact' check='full'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <model fallback='forbid'>Nehalem</model>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='x2apic'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='hypervisor'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <feature policy='require' name='vme'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <clock offset='utc'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='pit' tickpolicy='delay'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <timer name='hpet' present='no'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_poweroff>destroy</on_poweroff>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_reboot>restart</on_reboot>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <on_crash>destroy</on_crash>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <disk type='file' device='disk'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk' index='2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backingStore type='file' index='3'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <format type='raw'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <backingStore/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       </backingStore>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='vda' bus='virtio'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='virtio-disk0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <disk type='file' device='cdrom'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='qemu' type='raw' cache='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/disk.config' index='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backingStore/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='sda' bus='sata'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <readonly/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='sata0-0-0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='0' model='pcie-root'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pcie.0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='1' port='0x10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='2' port='0x11'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='3' port='0x12'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='4' port='0x13'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='5' port='0x14'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='6' port='0x15'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='7' port='0x16'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='8' port='0x17'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.8'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='9' port='0x18'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.9'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='10' port='0x19'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='11' port='0x1a'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.11'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='12' port='0x1b'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.12'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='13' port='0x1c'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.13'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='14' port='0x1d'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.14'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='15' port='0x1e'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.15'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='16' port='0x1f'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.16'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='17' port='0x20'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.17'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='18' port='0x21'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.18'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='19' port='0x22'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.19'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='20' port='0x23'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.20'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='21' port='0x24'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.21'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='22' port='0x25'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.22'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='23' port='0x26'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.23'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='24' port='0x27'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.24'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-root-port'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target chassis='25' port='0x28'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.25'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model name='pcie-pci-bridge'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='pci.26'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='usb'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <controller type='sata' index='0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='ide'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </controller>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <interface type='ethernet'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <mac address='fa:16:3e:52:c2:50'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target dev='tap22e0ead7-6f'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model type='virtio'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <driver name='vhost' rx_queue_size='512'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <mtu size='1442'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='net0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <serial type='pty'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target type='isa-serial' port='0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:         <model name='isa-serial'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       </target>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <console type='pty' tty='/dev/pts/0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <source path='/dev/pts/0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <log file='/var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6/console.log' append='off'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <target type='serial' port='0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='serial0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </console>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='tablet' bus='usb'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='usb' bus='0' port='1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='mouse' bus='ps2'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input1'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <input type='keyboard' bus='ps2'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='input2'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </input>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <graphics type='vnc' port='5903' autoport='yes' listen='::0'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <listen type='address' address='::0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </graphics>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <audio id='1' type='none'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <video>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <model type='virtio' heads='1' primary='yes'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='video0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </video>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <watchdog model='itco' action='reset'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='watchdog0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </watchdog>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <memballoon model='virtio'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <stats period='10'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='balloon0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <rng model='virtio'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <backend model='random'>/dev/urandom</backend>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <alias name='rng0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <label>system_u:system_r:svirt_t:s0:c412,c871</label>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c412,c871</imagelabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <label>+107:+107</label>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <imagelabel>+107:+107</imagelabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </seclabel>
Jan 22 22:40:35 compute-0 nova_compute[182725]: </domain>
Jan 22 22:40:35 compute-0 nova_compute[182725]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.936 182729 WARNING nova.virt.libvirt.driver [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Detaching interface fa:16:3e:97:5d:3c failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap658b3afc-98' not found.
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.937 182729 DEBUG nova.virt.libvirt.vif [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.937 182729 DEBUG nova.network.os_vif_util [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Converting VIF {"id": "658b3afc-9804-4041-afa0-856ac448b68e", "address": "fa:16:3e:97:5d:3c", "network": {"id": "e2e4afc8-807c-4b60-859b-b08af1bb8476", "bridge": "br-int", "label": "tempest-network-smoke--123525089", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap658b3afc-98", "ovs_interfaceid": "658b3afc-9804-4041-afa0-856ac448b68e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.938 182729 DEBUG nova.network.os_vif_util [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.938 182729 DEBUG os_vif [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.940 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.940 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap658b3afc-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.941 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.944 182729 INFO os_vif [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:5d:3c,bridge_name='br-int',has_traffic_filtering=True,id=658b3afc-9804-4041-afa0-856ac448b68e,network=Network(e2e4afc8-807c-4b60-859b-b08af1bb8476),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap658b3afc-98')
Jan 22 22:40:35 compute-0 nova_compute[182725]: 2026-01-22 22:40:35.944 182729 DEBUG nova.virt.libvirt.guest [req-ab82bd50-729d-4488-b87c-b99b780ff32c req-cf58ca9b-bd55-4b43-9b3a-206dfd7c06b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:name>tempest-TestNetworkBasicOps-server-80512993</nova:name>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:creationTime>2026-01-22 22:40:35</nova:creationTime>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:flavor name="m1.nano">
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:memory>128</nova:memory>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:disk>1</nova:disk>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:swap>0</nova:swap>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:vcpus>1</nova:vcpus>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:flavor>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:owner>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   <nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     <nova:port uuid="22e0ead7-6f30-4530-8c7a-18ca9aeeab12">
Jan 22 22:40:35 compute-0 nova_compute[182725]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 22:40:35 compute-0 nova_compute[182725]:     </nova:port>
Jan 22 22:40:35 compute-0 nova_compute[182725]:   </nova:ports>
Jan 22 22:40:35 compute-0 nova_compute[182725]: </nova:instance>
Jan 22 22:40:35 compute-0 nova_compute[182725]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.217 182729 DEBUG nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-unplugged-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.218 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.218 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.218 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.218 182729 DEBUG nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-unplugged-658b3afc-9804-4041-afa0-856ac448b68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.218 182729 WARNING nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-unplugged-658b3afc-9804-4041-afa0-856ac448b68e for instance with vm_state active and task_state None.
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 DEBUG nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 DEBUG oslo_concurrency.lockutils [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 DEBUG nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:36 compute-0 nova_compute[182725]: 2026-01-22 22:40:36.219 182729 WARNING nova.compute.manager [req-ab93afcf-fd73-4bdd-9e9c-b2facf8d8699 req-5db279eb-60ea-4f08-9807-a95a802fa772 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-plugged-658b3afc-9804-4041-afa0-856ac448b68e for instance with vm_state active and task_state None.
Jan 22 22:40:37 compute-0 nova_compute[182725]: 2026-01-22 22:40:37.031 182729 INFO nova.network.neutron [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Port 658b3afc-9804-4041-afa0-856ac448b68e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 22:40:37 compute-0 nova_compute[182725]: 2026-01-22 22:40:37.032 182729 DEBUG nova.network.neutron [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:37 compute-0 nova_compute[182725]: 2026-01-22 22:40:37.066 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:37 compute-0 nova_compute[182725]: 2026-01-22 22:40:37.098 182729 DEBUG oslo_concurrency.lockutils [None req-9977117e-f41b-4cbe-89f4-283134138ef4 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-e6a1471a-80f0-43ff-95e0-b865b6134ab6-658b3afc-9804-4041-afa0-856ac448b68e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:37 compute-0 ovn_controller[94850]: 2026-01-22T22:40:37Z|00535|binding|INFO|Releasing lport cf3a3ad2-5afe-400f-b31e-2a0edf61e11b from this chassis (sb_readonly=0)
Jan 22 22:40:37 compute-0 ovn_controller[94850]: 2026-01-22T22:40:37Z|00536|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:40:37 compute-0 nova_compute[182725]: 2026-01-22 22:40:37.598 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.427 182729 DEBUG nova.compute.manager [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.428 182729 DEBUG nova.compute.manager [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing instance network info cache due to event network-changed-22e0ead7-6f30-4530-8c7a-18ca9aeeab12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.429 182729 DEBUG oslo_concurrency.lockutils [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.429 182729 DEBUG oslo_concurrency.lockutils [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.429 182729 DEBUG nova.network.neutron [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Refreshing network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.582 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.582 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.583 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.583 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.584 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.603 182729 INFO nova.compute.manager [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Terminating instance
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.637 182729 DEBUG nova.compute.manager [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:40:38 compute-0 kernel: tap22e0ead7-6f (unregistering): left promiscuous mode
Jan 22 22:40:38 compute-0 NetworkManager[54954]: <info>  [1769121638.6777] device (tap22e0ead7-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.685 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 ovn_controller[94850]: 2026-01-22T22:40:38Z|00537|binding|INFO|Releasing lport 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 from this chassis (sb_readonly=0)
Jan 22 22:40:38 compute-0 ovn_controller[94850]: 2026-01-22T22:40:38Z|00538|binding|INFO|Setting lport 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 down in Southbound
Jan 22 22:40:38 compute-0 ovn_controller[94850]: 2026-01-22T22:40:38Z|00539|binding|INFO|Removing iface tap22e0ead7-6f ovn-installed in OVS
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.713 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:38.750 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:c2:50 10.100.0.10'], port_security=['fa:16:3e:52:c2:50 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e6a1471a-80f0-43ff-95e0-b865b6134ab6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb96457-41f3-4931-8421-59ae568f6512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7df5b530-2858-4af9-8ee2-0c5e2e8071be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b987ad2-bd3d-4a80-a6eb-b548d3af0bc7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=22e0ead7-6f30-4530-8c7a-18ca9aeeab12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:38.752 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12 in datapath fbb96457-41f3-4931-8421-59ae568f6512 unbound from our chassis
Jan 22 22:40:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:38.754 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbb96457-41f3-4931-8421-59ae568f6512, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:40:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:38.755 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6f42e878-b811-48f1-80ff-265ad7e279b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:38.756 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512 namespace which is not needed anymore
Jan 22 22:40:38 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Deactivated successfully.
Jan 22 22:40:38 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Consumed 15.182s CPU time.
Jan 22 22:40:38 compute-0 systemd-machined[154006]: Machine qemu-60-instance-00000084 terminated.
Jan 22 22:40:38 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [NOTICE]   (230461) : haproxy version is 2.8.14-c23fe91
Jan 22 22:40:38 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [NOTICE]   (230461) : path to executable is /usr/sbin/haproxy
Jan 22 22:40:38 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [WARNING]  (230461) : Exiting Master process...
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.915 182729 INFO nova.virt.libvirt.driver [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Instance destroyed successfully.
Jan 22 22:40:38 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [ALERT]    (230461) : Current worker (230463) exited with code 143 (Terminated)
Jan 22 22:40:38 compute-0 neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512[230457]: [WARNING]  (230461) : All workers exited. Exiting... (0)
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.916 182729 DEBUG nova.objects.instance [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid e6a1471a-80f0-43ff-95e0-b865b6134ab6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:38 compute-0 systemd[1]: libpod-318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78.scope: Deactivated successfully.
Jan 22 22:40:38 compute-0 podman[231077]: 2026-01-22 22:40:38.923979056 +0000 UTC m=+0.055707612 container died 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.931 182729 DEBUG nova.virt.libvirt.vif [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-80512993',display_name='tempest-TestNetworkBasicOps-server-80512993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-80512993',id=132,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC29/D+s+Rvk+bYe+vZrk1isHEdG1SpBFpnsAB0pAYt7J0AILvXVpTT+QxvFyT9KKy64asLlX9zTC7eJGQ6ofaJfUMSlCJMbhevfp7zufdRpycVtrv6cfZy0fi2T4qUCiQ==',key_name='tempest-TestNetworkBasicOps-538303151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-khlm7v33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:57Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=e6a1471a-80f0-43ff-95e0-b865b6134ab6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.931 182729 DEBUG nova.network.os_vif_util [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.932 182729 DEBUG nova.network.os_vif_util [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.932 182729 DEBUG os_vif [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.934 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.935 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e0ead7-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.936 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.941 182729 INFO os_vif [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:c2:50,bridge_name='br-int',has_traffic_filtering=True,id=22e0ead7-6f30-4530-8c7a-18ca9aeeab12,network=Network(fbb96457-41f3-4931-8421-59ae568f6512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e0ead7-6f')
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.941 182729 INFO nova.virt.libvirt.driver [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Deleting instance files /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6_del
Jan 22 22:40:38 compute-0 nova_compute[182725]: 2026-01-22 22:40:38.942 182729 INFO nova.virt.libvirt.driver [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Deletion of /var/lib/nova/instances/e6a1471a-80f0-43ff-95e0-b865b6134ab6_del complete
Jan 22 22:40:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78-userdata-shm.mount: Deactivated successfully.
Jan 22 22:40:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-66fbdb007b9fdaabd9e148d492367a94ba33deefd4161989bb74ad36c8592944-merged.mount: Deactivated successfully.
Jan 22 22:40:38 compute-0 podman[231077]: 2026-01-22 22:40:38.966502808 +0000 UTC m=+0.098231334 container cleanup 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 22:40:38 compute-0 systemd[1]: libpod-conmon-318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78.scope: Deactivated successfully.
Jan 22 22:40:39 compute-0 podman[231122]: 2026-01-22 22:40:39.020136157 +0000 UTC m=+0.034497252 container remove 318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.024 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[97f461ed-42f4-4467-ad82-54f6bb9f8453]: (4, ('Thu Jan 22 10:40:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512 (318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78)\n318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78\nThu Jan 22 10:40:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512 (318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78)\n318cb2edf1e7bc3a782d03014c6d1abea879341383a762ffc1b60d16ada5ee78\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.026 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[37b48970-7bb6-4e99-90de-4c3319c6030b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.027 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb96457-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.028 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:39 compute-0 kernel: tapfbb96457-40: left promiscuous mode
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.033 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94344d5c-ea5b-4000-80df-30ca960f1cb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.042 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.055 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d20552f-8cf5-4a1a-8993-ba295b9d45a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.056 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5c36d173-15ce-4757-b1fc-34f5a3c68337]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.075 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[71721c40-c394-467d-9e9e-0cbdbc967f56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523426, 'reachable_time': 19047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231137, 'error': None, 'target': 'ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.076 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbb96457-41f3-4931-8421-59ae568f6512 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:40:39 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:39.076 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce26780-b187-4807-b2ef-35f5002ce29f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:39 compute-0 systemd[1]: run-netns-ovnmeta\x2dfbb96457\x2d41f3\x2d4931\x2d8421\x2d59ae568f6512.mount: Deactivated successfully.
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.081 182729 INFO nova.compute.manager [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.081 182729 DEBUG oslo.service.loopingcall [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.082 182729 DEBUG nova.compute.manager [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.082 182729 DEBUG nova.network.neutron [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.478 182729 DEBUG nova.compute.manager [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-unplugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.479 182729 DEBUG oslo_concurrency.lockutils [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.480 182729 DEBUG oslo_concurrency.lockutils [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.480 182729 DEBUG oslo_concurrency.lockutils [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.480 182729 DEBUG nova.compute.manager [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-unplugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:39 compute-0 nova_compute[182725]: 2026-01-22 22:40:39.481 182729 DEBUG nova.compute.manager [req-28b0786e-df2e-409f-801c-2a537388ae1d req-492f098e-bc06-4748-b673-066835b09781 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-unplugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:40:40 compute-0 nova_compute[182725]: 2026-01-22 22:40:40.769 182729 DEBUG nova.network.neutron [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updated VIF entry in instance network info cache for port 22e0ead7-6f30-4530-8c7a-18ca9aeeab12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:40:40 compute-0 nova_compute[182725]: 2026-01-22 22:40:40.771 182729 DEBUG nova.network.neutron [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [{"id": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "address": "fa:16:3e:52:c2:50", "network": {"id": "fbb96457-41f3-4931-8421-59ae568f6512", "bridge": "br-int", "label": "tempest-network-smoke--1229393138", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e0ead7-6f", "ovs_interfaceid": "22e0ead7-6f30-4530-8c7a-18ca9aeeab12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:40 compute-0 nova_compute[182725]: 2026-01-22 22:40:40.798 182729 DEBUG oslo_concurrency.lockutils [req-c8c1fb0c-9bf4-452a-a079-6f7e3b573d53 req-12b1d987-86e2-4e0b-b1ae-d9b290e77160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e6a1471a-80f0-43ff-95e0-b865b6134ab6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.364 182729 DEBUG nova.network.neutron [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.397 182729 INFO nova.compute.manager [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Took 2.32 seconds to deallocate network for instance.
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.482 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.482 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.578 182729 DEBUG nova.compute.provider_tree [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.582 182729 DEBUG nova.compute.manager [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.582 182729 DEBUG oslo_concurrency.lockutils [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.582 182729 DEBUG oslo_concurrency.lockutils [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.583 182729 DEBUG oslo_concurrency.lockutils [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.583 182729 DEBUG nova.compute.manager [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] No waiting events found dispatching network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.583 182729 WARNING nova.compute.manager [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received unexpected event network-vif-plugged-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 for instance with vm_state deleted and task_state None.
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.583 182729 DEBUG nova.compute.manager [req-a8664078-6f7a-4258-9c98-9bcb17a787fe req-9525a412-d68d-41c1-bead-cc5e4c460ee9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Received event network-vif-deleted-22e0ead7-6f30-4530-8c7a-18ca9aeeab12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.595 182729 DEBUG nova.scheduler.client.report [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.616 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.648 182729 INFO nova.scheduler.client.report [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance e6a1471a-80f0-43ff-95e0-b865b6134ab6
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.730 182729 DEBUG oslo_concurrency.lockutils [None req-cf67e8e3-8bc9-4952-9f1c-42954ad3d7ba b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "e6a1471a-80f0-43ff-95e0-b865b6134ab6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.824 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.851 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Triggering sync for uuid 254e913f-3968-436b-afcc-e51c2350b232 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.851 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.851 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "254e913f-3968-436b-afcc-e51c2350b232" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:41 compute-0 nova_compute[182725]: 2026-01-22 22:40:41.874 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "254e913f-3968-436b-afcc-e51c2350b232" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:43 compute-0 nova_compute[182725]: 2026-01-22 22:40:43.937 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:44 compute-0 podman[231138]: 2026-01-22 22:40:44.154290474 +0000 UTC m=+0.074560453 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:40:44 compute-0 podman[231140]: 2026-01-22 22:40:44.154731075 +0000 UTC m=+0.062570913 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:40:44 compute-0 podman[231139]: 2026-01-22 22:40:44.16571669 +0000 UTC m=+0.080483591 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 22:40:44 compute-0 nova_compute[182725]: 2026-01-22 22:40:44.209 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:46 compute-0 ovn_controller[94850]: 2026-01-22T22:40:46Z|00540|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 22:40:46 compute-0 nova_compute[182725]: 2026-01-22 22:40:46.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:48 compute-0 nova_compute[182725]: 2026-01-22 22:40:48.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:49 compute-0 nova_compute[182725]: 2026-01-22 22:40:49.211 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:53 compute-0 nova_compute[182725]: 2026-01-22 22:40:53.919 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121638.9104707, e6a1471a-80f0-43ff-95e0-b865b6134ab6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:40:53 compute-0 nova_compute[182725]: 2026-01-22 22:40:53.921 182729 INFO nova.compute.manager [-] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] VM Stopped (Lifecycle Event)
Jan 22 22:40:53 compute-0 nova_compute[182725]: 2026-01-22 22:40:53.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:53 compute-0 nova_compute[182725]: 2026-01-22 22:40:53.944 182729 DEBUG nova.compute.manager [None req-e9580ffd-7c44-4520-9bb9-6fd80ab22afe - - - - - -] [instance: e6a1471a-80f0-43ff-95e0-b865b6134ab6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:40:54 compute-0 nova_compute[182725]: 2026-01-22 22:40:54.216 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:54 compute-0 sshd-session[231205]: Connection closed by 142.93.3.75 port 47732
Jan 22 22:40:54 compute-0 nova_compute[182725]: 2026-01-22 22:40:54.946 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.603 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.603 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.604 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.604 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.604 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.616 182729 INFO nova.compute.manager [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Terminating instance
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.626 182729 DEBUG nova.compute.manager [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:40:58 compute-0 kernel: tap354f33c9-4c (unregistering): left promiscuous mode
Jan 22 22:40:58 compute-0 NetworkManager[54954]: <info>  [1769121658.6601] device (tap354f33c9-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:40:58 compute-0 ovn_controller[94850]: 2026-01-22T22:40:58Z|00541|binding|INFO|Releasing lport 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 from this chassis (sb_readonly=0)
Jan 22 22:40:58 compute-0 ovn_controller[94850]: 2026-01-22T22:40:58Z|00542|binding|INFO|Setting lport 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 down in Southbound
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.669 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 ovn_controller[94850]: 2026-01-22T22:40:58Z|00543|binding|INFO|Removing iface tap354f33c9-4c ovn-installed in OVS
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.672 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.681 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:e4:29 10.100.0.8'], port_security=['fa:16:3e:7d:e4:29 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '254e913f-3968-436b-afcc-e51c2350b232', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d993940-8666-43d7-8759-418fc1311e0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=354f33c9-4c2e-4d16-bcc6-072c571ea8a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.683 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 354f33c9-4c2e-4d16-bcc6-072c571ea8a3 in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 unbound from our chassis
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.685 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 84d8b010-d968-4df4-bedf-0c350ae42113, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.687 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbbb759-d8dd-4a58-bc4a-e829eeea8665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.688 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace which is not needed anymore
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.695 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 22 22:40:58 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007b.scope: Consumed 19.654s CPU time.
Jan 22 22:40:58 compute-0 systemd-machined[154006]: Machine qemu-56-instance-0000007b terminated.
Jan 22 22:40:58 compute-0 podman[231207]: 2026-01-22 22:40:58.746546128 +0000 UTC m=+0.068441571 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [NOTICE]   (229086) : haproxy version is 2.8.14-c23fe91
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [NOTICE]   (229086) : path to executable is /usr/sbin/haproxy
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [WARNING]  (229086) : Exiting Master process...
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [WARNING]  (229086) : Exiting Master process...
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [ALERT]    (229086) : Current worker (229088) exited with code 143 (Terminated)
Jan 22 22:40:58 compute-0 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229082]: [WARNING]  (229086) : All workers exited. Exiting... (0)
Jan 22 22:40:58 compute-0 systemd[1]: libpod-159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c.scope: Deactivated successfully.
Jan 22 22:40:58 compute-0 podman[231252]: 2026-01-22 22:40:58.846616487 +0000 UTC m=+0.053443286 container died 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:40:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c-userdata-shm.mount: Deactivated successfully.
Jan 22 22:40:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-b0607957d6c5a59b155906644247d694d50bf3738a08e6d7f4feea2dc2a5c10f-merged.mount: Deactivated successfully.
Jan 22 22:40:58 compute-0 podman[231252]: 2026-01-22 22:40:58.889073037 +0000 UTC m=+0.095899836 container cleanup 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.898 182729 INFO nova.virt.libvirt.driver [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Instance destroyed successfully.
Jan 22 22:40:58 compute-0 systemd[1]: libpod-conmon-159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c.scope: Deactivated successfully.
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.898 182729 DEBUG nova.objects.instance [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'resources' on Instance uuid 254e913f-3968-436b-afcc-e51c2350b232 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.925 182729 DEBUG nova.virt.libvirt.vif [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:37:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-424272542',display_name='tempest-ServerActionsTestOtherB-server-424272542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-424272542',id=123,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:38:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-5vfbzi0x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:38:04Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=254e913f-3968-436b-afcc-e51c2350b232,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.926 182729 DEBUG nova.network.os_vif_util [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "address": "fa:16:3e:7d:e4:29", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap354f33c9-4c", "ovs_interfaceid": "354f33c9-4c2e-4d16-bcc6-072c571ea8a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.926 182729 DEBUG nova.network.os_vif_util [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.927 182729 DEBUG os_vif [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.929 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.929 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354f33c9-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.931 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.936 182729 INFO os_vif [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:e4:29,bridge_name='br-int',has_traffic_filtering=True,id=354f33c9-4c2e-4d16-bcc6-072c571ea8a3,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap354f33c9-4c')
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.936 182729 INFO nova.virt.libvirt.driver [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Deleting instance files /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232_del
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.937 182729 INFO nova.virt.libvirt.driver [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Deletion of /var/lib/nova/instances/254e913f-3968-436b-afcc-e51c2350b232_del complete
Jan 22 22:40:58 compute-0 podman[231300]: 2026-01-22 22:40:58.951938897 +0000 UTC m=+0.040403760 container remove 159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.956 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[44716234-9ab9-416a-94d2-450155520ed2]: (4, ('Thu Jan 22 10:40:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c)\n159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c\nThu Jan 22 10:40:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c)\n159dc994ef423f7a9607b40263071f1e69a77d70b338df874da8d27db656220c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.958 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4ab9c6-d55f-4ea3-b763-64eaaf099d7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.958 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 kernel: tap84d8b010-d0: left promiscuous mode
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.962 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.964 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[772664f5-4bd0-4d7b-88ad-c96686aa533d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.972 182729 DEBUG nova.compute.manager [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-unplugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.973 182729 DEBUG oslo_concurrency.lockutils [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.973 182729 DEBUG oslo_concurrency.lockutils [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.974 182729 DEBUG oslo_concurrency.lockutils [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.974 182729 DEBUG nova.compute.manager [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] No waiting events found dispatching network-vif-unplugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.975 182729 DEBUG nova.compute.manager [req-b97b12eb-cc87-4898-a78a-5c32f442a978 req-ae5ad705-e372-4cd1-8618-f58c3cef057b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-unplugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:40:58 compute-0 nova_compute[182725]: 2026-01-22 22:40:58.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.982 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e18f00df-6bf3-41f8-8b89-3baf0bb0e785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:58.983 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[11bff975-a1a6-48ef-ad94-b47c96a54e09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:59.000 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[113fe1df-1bf9-4e13-acdb-4229c91cb3d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512180, 'reachable_time': 36272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231314, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:59.003 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:40:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:40:59.003 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[3c09ab96-7cb7-45a1-b4a0-4b3653d84512]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:40:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d84d8b010\x2dd968\x2d4df4\x2dbedf\x2d0c350ae42113.mount: Deactivated successfully.
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.012 182729 INFO nova.compute.manager [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.012 182729 DEBUG oslo.service.loopingcall [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.013 182729 DEBUG nova.compute.manager [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.013 182729 DEBUG nova.network.neutron [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.222 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.670 182729 DEBUG nova.network.neutron [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.701 182729 INFO nova.compute.manager [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Took 0.69 seconds to deallocate network for instance.
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.801 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.801 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.882 182729 DEBUG nova.compute.provider_tree [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.906 182729 DEBUG nova.scheduler.client.report [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.941 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:40:59 compute-0 nova_compute[182725]: 2026-01-22 22:40:59.979 182729 INFO nova.scheduler.client.report [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Deleted allocations for instance 254e913f-3968-436b-afcc-e51c2350b232
Jan 22 22:41:00 compute-0 nova_compute[182725]: 2026-01-22 22:41:00.101 182729 DEBUG oslo_concurrency.lockutils [None req-020aefd0-892a-4166-a5eb-95e931e0d099 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:00 compute-0 nova_compute[182725]: 2026-01-22 22:41:00.298 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.186 182729 DEBUG nova.compute.manager [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.187 182729 DEBUG oslo_concurrency.lockutils [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "254e913f-3968-436b-afcc-e51c2350b232-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.187 182729 DEBUG oslo_concurrency.lockutils [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.188 182729 DEBUG oslo_concurrency.lockutils [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "254e913f-3968-436b-afcc-e51c2350b232-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.188 182729 DEBUG nova.compute.manager [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] No waiting events found dispatching network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.189 182729 WARNING nova.compute.manager [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received unexpected event network-vif-plugged-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 for instance with vm_state deleted and task_state None.
Jan 22 22:41:01 compute-0 nova_compute[182725]: 2026-01-22 22:41:01.189 182729 DEBUG nova.compute.manager [req-821856be-e5a2-4112-b7d4-1f697b41053d req-b628c99f-7b1b-4909-9f5d-699da85cd2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Received event network-vif-deleted-354f33c9-4c2e-4d16-bcc6-072c571ea8a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:03 compute-0 podman[231316]: 2026-01-22 22:41:03.139804781 +0000 UTC m=+0.067757843 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 22 22:41:03 compute-0 podman[231315]: 2026-01-22 22:41:03.184023594 +0000 UTC m=+0.105915035 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 22:41:03 compute-0 nova_compute[182725]: 2026-01-22 22:41:03.932 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:04 compute-0 nova_compute[182725]: 2026-01-22 22:41:04.224 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:06 compute-0 nova_compute[182725]: 2026-01-22 22:41:06.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:06 compute-0 nova_compute[182725]: 2026-01-22 22:41:06.219 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.096 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.096 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.124 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.232 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.233 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.238 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.238 182729 INFO nova.compute.claims [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.347 182729 DEBUG nova.compute.provider_tree [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.360 182729 DEBUG nova.scheduler.client.report [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.382 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.383 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.446 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.446 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.470 182729 INFO nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.489 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.607 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.608 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.608 182729 INFO nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Creating image(s)
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.609 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.609 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.610 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.622 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.678 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.680 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.681 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.692 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.713 182729 DEBUG nova.policy [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.748 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.748 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.785 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.786 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.787 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.860 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.861 182729 DEBUG nova.virt.disk.api [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.862 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.928 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.929 182729 DEBUG nova.virt.disk.api [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.929 182729 DEBUG nova.objects.instance [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid 74109741-c148-4605-9d06-89092c674c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.944 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.944 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Ensure instance console log exists: /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.945 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.945 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:07 compute-0 nova_compute[182725]: 2026-01-22 22:41:07.945 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:08 compute-0 nova_compute[182725]: 2026-01-22 22:41:08.935 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:08 compute-0 nova_compute[182725]: 2026-01-22 22:41:08.952 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Successfully created port: 63af27f0-00f9-42b2-8238-747b706a37b9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:41:09 compute-0 nova_compute[182725]: 2026-01-22 22:41:09.226 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:09 compute-0 nova_compute[182725]: 2026-01-22 22:41:09.956 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Successfully updated port: 63af27f0-00f9-42b2-8238-747b706a37b9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:41:09 compute-0 nova_compute[182725]: 2026-01-22 22:41:09.978 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:09 compute-0 nova_compute[182725]: 2026-01-22 22:41:09.979 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:09 compute-0 nova_compute[182725]: 2026-01-22 22:41:09.979 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:41:10 compute-0 nova_compute[182725]: 2026-01-22 22:41:10.094 182729 DEBUG nova.compute.manager [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:10 compute-0 nova_compute[182725]: 2026-01-22 22:41:10.095 182729 DEBUG nova.compute.manager [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing instance network info cache due to event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:41:10 compute-0 nova_compute[182725]: 2026-01-22 22:41:10.095 182729 DEBUG oslo_concurrency.lockutils [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:10 compute-0 nova_compute[182725]: 2026-01-22 22:41:10.134 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.016 182729 DEBUG nova.network.neutron [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updating instance_info_cache with network_info: [{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.038 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.039 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Instance network_info: |[{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.040 182729 DEBUG oslo_concurrency.lockutils [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.041 182729 DEBUG nova.network.neutron [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.047 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Start _get_guest_xml network_info=[{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.055 182729 WARNING nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.060 182729 DEBUG nova.virt.libvirt.host [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.061 182729 DEBUG nova.virt.libvirt.host [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.071 182729 DEBUG nova.virt.libvirt.host [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.072 182729 DEBUG nova.virt.libvirt.host [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.074 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.075 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.076 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.076 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.077 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.077 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.078 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.078 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.079 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.079 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.079 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.080 182729 DEBUG nova.virt.hardware [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.091 182729 DEBUG nova.virt.libvirt.vif [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1025859772',display_name='tempest-TestNetworkBasicOps-server-1025859772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1025859772',id=135,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLBZCXz6fycQZFoL56FerhyIuHcQG09XuJEwm5KobXBGredHqQ3oQAQNN8PZiUbbZVPazeINs1jU9paRqiDEAlumnLbtTwFnXODFBtHbILH1yhQvEihQYo8uEwEmNVj7AA==',key_name='tempest-TestNetworkBasicOps-386073472',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-c81g4iho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:07Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=74109741-c148-4605-9d06-89092c674c04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.092 182729 DEBUG nova.network.os_vif_util [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.093 182729 DEBUG nova.network.os_vif_util [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.094 182729 DEBUG nova.objects.instance [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74109741-c148-4605-9d06-89092c674c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.115 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <uuid>74109741-c148-4605-9d06-89092c674c04</uuid>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <name>instance-00000087</name>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-1025859772</nova:name>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:41:11</nova:creationTime>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         <nova:port uuid="63af27f0-00f9-42b2-8238-747b706a37b9">
Jan 22 22:41:11 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <system>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="serial">74109741-c148-4605-9d06-89092c674c04</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="uuid">74109741-c148-4605-9d06-89092c674c04</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </system>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <os>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </os>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <features>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </features>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.config"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:89:ea:13"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <target dev="tap63af27f0-00"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/console.log" append="off"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <video>
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </video>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:41:11 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:41:11 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:41:11 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:41:11 compute-0 nova_compute[182725]: </domain>
Jan 22 22:41:11 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.117 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Preparing to wait for external event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.117 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.118 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.118 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.119 182729 DEBUG nova.virt.libvirt.vif [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1025859772',display_name='tempest-TestNetworkBasicOps-server-1025859772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1025859772',id=135,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLBZCXz6fycQZFoL56FerhyIuHcQG09XuJEwm5KobXBGredHqQ3oQAQNN8PZiUbbZVPazeINs1jU9paRqiDEAlumnLbtTwFnXODFBtHbILH1yhQvEihQYo8uEwEmNVj7AA==',key_name='tempest-TestNetworkBasicOps-386073472',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-c81g4iho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:07Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=74109741-c148-4605-9d06-89092c674c04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.119 182729 DEBUG nova.network.os_vif_util [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.120 182729 DEBUG nova.network.os_vif_util [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.121 182729 DEBUG os_vif [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.122 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.122 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.125 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.126 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63af27f0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.126 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63af27f0-00, col_values=(('external_ids', {'iface-id': '63af27f0-00f9-42b2-8238-747b706a37b9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:ea:13', 'vm-uuid': '74109741-c148-4605-9d06-89092c674c04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.128 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:11 compute-0 NetworkManager[54954]: <info>  [1769121671.1298] manager: (tap63af27f0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.131 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.137 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.138 182729 INFO os_vif [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00')
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.197 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.198 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.198 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:89:ea:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.199 182729 INFO nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Using config drive
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.840 182729 INFO nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Creating config drive at /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.config
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.846 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnfqddrkk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:11 compute-0 nova_compute[182725]: 2026-01-22 22:41:11.990 182729 DEBUG oslo_concurrency.processutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnfqddrkk" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:12 compute-0 kernel: tap63af27f0-00: entered promiscuous mode
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.0824] manager: (tap63af27f0-00): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Jan 22 22:41:12 compute-0 ovn_controller[94850]: 2026-01-22T22:41:12Z|00544|binding|INFO|Claiming lport 63af27f0-00f9-42b2-8238-747b706a37b9 for this chassis.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.084 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_controller[94850]: 2026-01-22T22:41:12Z|00545|binding|INFO|63af27f0-00f9-42b2-8238-747b706a37b9: Claiming fa:16:3e:89:ea:13 10.100.0.12
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.092 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.111 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:ea:13 10.100.0.12'], port_security=['fa:16:3e:89:ea:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '74109741-c148-4605-9d06-89092c674c04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6c5c79c1-e9f8-43a3-b41e-c9de2dd5a501', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=624f7740-abec-4eab-ab48-c44640782167, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=63af27f0-00f9-42b2-8238-747b706a37b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.113 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 63af27f0-00f9-42b2-8238-747b706a37b9 in datapath acc24254-349c-445c-aeed-f8eb3ec4b92e bound to our chassis
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.116 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network acc24254-349c-445c-aeed-f8eb3ec4b92e
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.135 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b85292b5-8392-4f57-8715-1481e5d9ecf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.136 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapacc24254-31 in ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.140 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapacc24254-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.140 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0119e178-96a1-42a6-aefc-ce965c863ef8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 systemd-udevd[231399]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.141 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7339bd71-8996-47da-b625-2e3e00e4696c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 systemd-machined[154006]: New machine qemu-62-instance-00000087.
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.161 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[d25b7260-7242-497b-9c78-1e05f7a8aeb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.1672] device (tap63af27f0-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.1697] device (tap63af27f0-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.170 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_controller[94850]: 2026-01-22T22:41:12Z|00546|binding|INFO|Setting lport 63af27f0-00f9-42b2-8238-747b706a37b9 ovn-installed in OVS
Jan 22 22:41:12 compute-0 ovn_controller[94850]: 2026-01-22T22:41:12Z|00547|binding|INFO|Setting lport 63af27f0-00f9-42b2-8238-747b706a37b9 up in Southbound
Jan 22 22:41:12 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-00000087.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.178 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.182 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1652f9aa-d9d2-4516-8a97-1e5cd88b8822]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.215 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[635ef38e-9292-4509-9214-1d6bfc78eba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.221 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e81f5a6-345a-4497-ae91-6744d847c617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.2228] manager: (tapacc24254-30): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Jan 22 22:41:12 compute-0 systemd-udevd[231403]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.250 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b0603bd8-e95c-41a4-8664-51092dcb18fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.254 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fe308d97-5f29-4f5c-80f3-b6ed2b4ff9f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.2739] device (tapacc24254-30): carrier: link connected
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.277 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1da1421f-94c1-44f9-938b-e163a58ed7f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.292 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ccadfc92-a2e2-4d12-b524-f45ba0d1b587]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapacc24254-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:5c:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530987, 'reachable_time': 18894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231431, 'error': None, 'target': 'ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.307 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc06132-6443-4775-b832-7443ce653685]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:5c3a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530987, 'tstamp': 530987}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231433, 'error': None, 'target': 'ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.321 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7f12c97a-bea2-4b33-b03e-eafa7fe3f42f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapacc24254-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:5c:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530987, 'reachable_time': 18894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231434, 'error': None, 'target': 'ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.351 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[39d30ee4-f55a-4780-a2bd-736e31772eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.405 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[246ce921-02ee-40f8-8f68-859dfa556ef2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.407 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacc24254-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.407 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.407 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacc24254-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.409 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 NetworkManager[54954]: <info>  [1769121672.4098] manager: (tapacc24254-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 22 22:41:12 compute-0 kernel: tapacc24254-30: entered promiscuous mode
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.412 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.412 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapacc24254-30, col_values=(('external_ids', {'iface-id': 'be01ff0b-c56d-4692-bcfa-5b4b56e84a06'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_controller[94850]: 2026-01-22T22:41:12Z|00548|binding|INFO|Releasing lport be01ff0b-c56d-4692-bcfa-5b4b56e84a06 from this chassis (sb_readonly=0)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.424 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.424 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/acc24254-349c-445c-aeed-f8eb3ec4b92e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/acc24254-349c-445c-aeed-f8eb3ec4b92e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.425 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82378c0c-57b6-4f64-ac51-4f07fd6962cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.426 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-acc24254-349c-445c-aeed-f8eb3ec4b92e
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/acc24254-349c-445c-aeed-f8eb3ec4b92e.pid.haproxy
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID acc24254-349c-445c-aeed-f8eb3ec4b92e
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.427 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'env', 'PROCESS_TAG=haproxy-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/acc24254-349c-445c-aeed-f8eb3ec4b92e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.449 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.450 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:12.450 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.573 182729 DEBUG nova.network.neutron [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updated VIF entry in instance network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.574 182729 DEBUG nova.network.neutron [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updating instance_info_cache with network_info: [{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.595 182729 DEBUG oslo_concurrency.lockutils [req-b622c8b2-c149-4845-be2d-4f3028996f49 req-0fffd31f-acf2-4269-aa52-d0369e2bc8d8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.685 182729 DEBUG nova.compute.manager [req-8e9d9be0-3a93-471f-8c5f-0ac318b85157 req-3842d9ec-c8bd-4b80-b6f2-6d35f5ba93e3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.686 182729 DEBUG oslo_concurrency.lockutils [req-8e9d9be0-3a93-471f-8c5f-0ac318b85157 req-3842d9ec-c8bd-4b80-b6f2-6d35f5ba93e3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.686 182729 DEBUG oslo_concurrency.lockutils [req-8e9d9be0-3a93-471f-8c5f-0ac318b85157 req-3842d9ec-c8bd-4b80-b6f2-6d35f5ba93e3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.687 182729 DEBUG oslo_concurrency.lockutils [req-8e9d9be0-3a93-471f-8c5f-0ac318b85157 req-3842d9ec-c8bd-4b80-b6f2-6d35f5ba93e3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.687 182729 DEBUG nova.compute.manager [req-8e9d9be0-3a93-471f-8c5f-0ac318b85157 req-3842d9ec-c8bd-4b80-b6f2-6d35f5ba93e3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Processing event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:41:12 compute-0 podman[231470]: 2026-01-22 22:41:12.79616943 +0000 UTC m=+0.059479936 container create 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.802 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.804 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121672.802085, 74109741-c148-4605-9d06-89092c674c04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.805 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] VM Started (Lifecycle Event)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.808 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.818 182729 INFO nova.virt.libvirt.driver [-] [instance: 74109741-c148-4605-9d06-89092c674c04] Instance spawned successfully.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.819 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.839 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.845 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:41:12 compute-0 systemd[1]: Started libpod-conmon-10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b.scope.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.850 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.851 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.851 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.852 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.852 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.852 182729 DEBUG nova.virt.libvirt.driver [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:12 compute-0 podman[231470]: 2026-01-22 22:41:12.762441088 +0000 UTC m=+0.025751614 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.878 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.879 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121672.803367, 74109741-c148-4605-9d06-89092c674c04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.879 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] VM Paused (Lifecycle Event)
Jan 22 22:41:12 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:41:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15a006ed379b590d07aff1ea2f9a57532054fea5d4dc903cf752e2e125d19173/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:41:12 compute-0 podman[231470]: 2026-01-22 22:41:12.906590698 +0000 UTC m=+0.169901224 container init 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.911 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:12 compute-0 podman[231470]: 2026-01-22 22:41:12.912348162 +0000 UTC m=+0.175658658 container start 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.915 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121672.809068, 74109741-c148-4605-9d06-89092c674c04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.915 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] VM Resumed (Lifecycle Event)
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.933 182729 INFO nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Took 5.33 seconds to spawn the instance on the hypervisor.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.934 182729 DEBUG nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:12 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [NOTICE]   (231490) : New worker (231492) forked
Jan 22 22:41:12 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [NOTICE]   (231490) : Loading success.
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.936 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.946 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:41:12 compute-0 nova_compute[182725]: 2026-01-22 22:41:12.977 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:41:13 compute-0 nova_compute[182725]: 2026-01-22 22:41:13.066 182729 INFO nova.compute.manager [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Took 5.87 seconds to build instance.
Jan 22 22:41:13 compute-0 nova_compute[182725]: 2026-01-22 22:41:13.081 182729 DEBUG oslo_concurrency.lockutils [None req-b46c26dd-b97c-4675-a7d4-fcefbab66f6d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.984s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:13 compute-0 nova_compute[182725]: 2026-01-22 22:41:13.897 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121658.8958702, 254e913f-3968-436b-afcc-e51c2350b232 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:13 compute-0 nova_compute[182725]: 2026-01-22 22:41:13.898 182729 INFO nova.compute.manager [-] [instance: 254e913f-3968-436b-afcc-e51c2350b232] VM Stopped (Lifecycle Event)
Jan 22 22:41:13 compute-0 nova_compute[182725]: 2026-01-22 22:41:13.939 182729 DEBUG nova.compute.manager [None req-19d042e6-37b7-40bf-b8c2-c3238c5114cb - - - - - -] [instance: 254e913f-3968-436b-afcc-e51c2350b232] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.229 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.828 182729 DEBUG nova.compute.manager [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.829 182729 DEBUG oslo_concurrency.lockutils [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.829 182729 DEBUG oslo_concurrency.lockutils [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.829 182729 DEBUG oslo_concurrency.lockutils [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.829 182729 DEBUG nova.compute.manager [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] No waiting events found dispatching network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:41:14 compute-0 nova_compute[182725]: 2026-01-22 22:41:14.830 182729 WARNING nova.compute.manager [req-b5431b3f-37b5-4aa9-9bcc-1a6cf6e54ba3 req-ad211c90-ea13-49dc-922d-5f30efd42d6c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received unexpected event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 for instance with vm_state active and task_state None.
Jan 22 22:41:15 compute-0 podman[231502]: 2026-01-22 22:41:15.130406033 +0000 UTC m=+0.060696016 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 22:41:15 compute-0 podman[231501]: 2026-01-22 22:41:15.13069115 +0000 UTC m=+0.066952783 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:41:15 compute-0 podman[231503]: 2026-01-22 22:41:15.138854574 +0000 UTC m=+0.067356373 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:41:16 compute-0 nova_compute[182725]: 2026-01-22 22:41:16.129 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:16 compute-0 nova_compute[182725]: 2026-01-22 22:41:16.875 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:16 compute-0 NetworkManager[54954]: <info>  [1769121676.8806] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 22 22:41:16 compute-0 NetworkManager[54954]: <info>  [1769121676.8829] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 22 22:41:16 compute-0 nova_compute[182725]: 2026-01-22 22:41:16.987 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:16 compute-0 ovn_controller[94850]: 2026-01-22T22:41:16Z|00549|binding|INFO|Releasing lport be01ff0b-c56d-4692-bcfa-5b4b56e84a06 from this chassis (sb_readonly=0)
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.018 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.556 182729 DEBUG nova.compute.manager [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.558 182729 DEBUG nova.compute.manager [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing instance network info cache due to event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.559 182729 DEBUG oslo_concurrency.lockutils [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.560 182729 DEBUG oslo_concurrency.lockutils [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.560 182729 DEBUG nova.network.neutron [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:41:17 compute-0 nova_compute[182725]: 2026-01-22 22:41:17.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:19 compute-0 nova_compute[182725]: 2026-01-22 22:41:19.121 182729 DEBUG nova.network.neutron [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updated VIF entry in instance network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:41:19 compute-0 nova_compute[182725]: 2026-01-22 22:41:19.124 182729 DEBUG nova.network.neutron [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updating instance_info_cache with network_info: [{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:19 compute-0 nova_compute[182725]: 2026-01-22 22:41:19.144 182729 DEBUG oslo_concurrency.lockutils [req-8815194c-83ab-43b4-a6e0-7a1ce6426cb3 req-6fddf845-4858-4742-a4e7-a20d18579961 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:41:19 compute-0 nova_compute[182725]: 2026-01-22 22:41:19.232 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:20 compute-0 nova_compute[182725]: 2026-01-22 22:41:20.911 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:21 compute-0 nova_compute[182725]: 2026-01-22 22:41:21.132 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:24 compute-0 nova_compute[182725]: 2026-01-22 22:41:24.234 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:25 compute-0 ovn_controller[94850]: 2026-01-22T22:41:25Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:ea:13 10.100.0.12
Jan 22 22:41:25 compute-0 ovn_controller[94850]: 2026-01-22T22:41:25Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:ea:13 10.100.0.12
Jan 22 22:41:25 compute-0 nova_compute[182725]: 2026-01-22 22:41:25.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:25 compute-0 nova_compute[182725]: 2026-01-22 22:41:25.924 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:25 compute-0 nova_compute[182725]: 2026-01-22 22:41:25.925 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:25 compute-0 nova_compute[182725]: 2026-01-22 22:41:25.925 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:25 compute-0 nova_compute[182725]: 2026-01-22 22:41:25.926 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.018 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.122 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.124 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.220 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.408 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.410 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.30381774902344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.411 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.411 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.536 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 74109741-c148-4605-9d06-89092c674c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.537 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.537 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.559 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.589 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.590 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.615 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.680 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.759 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.778 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.823 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:41:26 compute-0 nova_compute[182725]: 2026-01-22 22:41:26.824 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:27 compute-0 ovn_controller[94850]: 2026-01-22T22:41:27Z|00550|binding|INFO|Releasing lport be01ff0b-c56d-4692-bcfa-5b4b56e84a06 from this chassis (sb_readonly=0)
Jan 22 22:41:27 compute-0 nova_compute[182725]: 2026-01-22 22:41:27.610 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:27 compute-0 nova_compute[182725]: 2026-01-22 22:41:27.824 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:27 compute-0 nova_compute[182725]: 2026-01-22 22:41:27.825 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:41:27 compute-0 nova_compute[182725]: 2026-01-22 22:41:27.893 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:41:27 compute-0 nova_compute[182725]: 2026-01-22 22:41:27.894 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:29 compute-0 podman[231586]: 2026-01-22 22:41:29.145380199 +0000 UTC m=+0.075761333 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:41:29 compute-0 nova_compute[182725]: 2026-01-22 22:41:29.236 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:29 compute-0 nova_compute[182725]: 2026-01-22 22:41:29.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:30 compute-0 nova_compute[182725]: 2026-01-22 22:41:30.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:31 compute-0 nova_compute[182725]: 2026-01-22 22:41:31.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:32 compute-0 nova_compute[182725]: 2026-01-22 22:41:32.223 182729 INFO nova.compute.manager [None req-d4f8c2dc-2853-46e3-b652-45d6311a34bb b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Get console output
Jan 22 22:41:32 compute-0 nova_compute[182725]: 2026-01-22 22:41:32.233 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.151 182729 INFO nova.compute.manager [None req-b54b1941-0204-43c7-81aa-01f24376a45a b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Get console output
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.158 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.930 182729 DEBUG nova.compute.manager [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.931 182729 DEBUG nova.compute.manager [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing instance network info cache due to event network-changed-63af27f0-00f9-42b2-8238-747b706a37b9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.931 182729 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.932 182729 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:33 compute-0 nova_compute[182725]: 2026-01-22 22:41:33.932 182729 DEBUG nova.network.neutron [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Refreshing network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.011 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.012 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.012 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.013 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.013 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.028 182729 INFO nova.compute.manager [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Terminating instance
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.040 182729 DEBUG nova.compute.manager [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:41:34 compute-0 kernel: tap63af27f0-00 (unregistering): left promiscuous mode
Jan 22 22:41:34 compute-0 NetworkManager[54954]: <info>  [1769121694.0648] device (tap63af27f0-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:41:34 compute-0 ovn_controller[94850]: 2026-01-22T22:41:34Z|00551|binding|INFO|Releasing lport 63af27f0-00f9-42b2-8238-747b706a37b9 from this chassis (sb_readonly=0)
Jan 22 22:41:34 compute-0 ovn_controller[94850]: 2026-01-22T22:41:34Z|00552|binding|INFO|Setting lport 63af27f0-00f9-42b2-8238-747b706a37b9 down in Southbound
Jan 22 22:41:34 compute-0 ovn_controller[94850]: 2026-01-22T22:41:34Z|00553|binding|INFO|Removing iface tap63af27f0-00 ovn-installed in OVS
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.078 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.086 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:ea:13 10.100.0.12'], port_security=['fa:16:3e:89:ea:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '74109741-c148-4605-9d06-89092c674c04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6c5c79c1-e9f8-43a3-b41e-c9de2dd5a501', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=624f7740-abec-4eab-ab48-c44640782167, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=63af27f0-00f9-42b2-8238-747b706a37b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.087 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 63af27f0-00f9-42b2-8238-747b706a37b9 in datapath acc24254-349c-445c-aeed-f8eb3ec4b92e unbound from our chassis
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.096 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network acc24254-349c-445c-aeed-f8eb3ec4b92e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.098 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aa00c3a6-b30b-4635-9c2a-9e7805afb73b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.103 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e namespace which is not needed anymore
Jan 22 22:41:34 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 22 22:41:34 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000087.scope: Consumed 13.175s CPU time.
Jan 22 22:41:34 compute-0 systemd-machined[154006]: Machine qemu-62-instance-00000087 terminated.
Jan 22 22:41:34 compute-0 podman[231609]: 2026-01-22 22:41:34.187360853 +0000 UTC m=+0.090401948 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 22 22:41:34 compute-0 podman[231606]: 2026-01-22 22:41:34.22047945 +0000 UTC m=+0.136245203 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.237 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.272 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.278 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [NOTICE]   (231490) : haproxy version is 2.8.14-c23fe91
Jan 22 22:41:34 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [NOTICE]   (231490) : path to executable is /usr/sbin/haproxy
Jan 22 22:41:34 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [WARNING]  (231490) : Exiting Master process...
Jan 22 22:41:34 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [ALERT]    (231490) : Current worker (231492) exited with code 143 (Terminated)
Jan 22 22:41:34 compute-0 neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e[231486]: [WARNING]  (231490) : All workers exited. Exiting... (0)
Jan 22 22:41:34 compute-0 systemd[1]: libpod-10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b.scope: Deactivated successfully.
Jan 22 22:41:34 compute-0 podman[231673]: 2026-01-22 22:41:34.314262392 +0000 UTC m=+0.081836534 container died 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.317 182729 INFO nova.virt.libvirt.driver [-] [instance: 74109741-c148-4605-9d06-89092c674c04] Instance destroyed successfully.
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.318 182729 DEBUG nova.objects.instance [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid 74109741-c148-4605-9d06-89092c674c04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.330 182729 DEBUG nova.virt.libvirt.vif [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1025859772',display_name='tempest-TestNetworkBasicOps-server-1025859772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1025859772',id=135,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLBZCXz6fycQZFoL56FerhyIuHcQG09XuJEwm5KobXBGredHqQ3oQAQNN8PZiUbbZVPazeINs1jU9paRqiDEAlumnLbtTwFnXODFBtHbILH1yhQvEihQYo8uEwEmNVj7AA==',key_name='tempest-TestNetworkBasicOps-386073472',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-c81g4iho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:41:12Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=74109741-c148-4605-9d06-89092c674c04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.331 182729 DEBUG nova.network.os_vif_util [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.332 182729 DEBUG nova.network.os_vif_util [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.332 182729 DEBUG os_vif [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.334 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.335 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63af27f0-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.336 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:41:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b-userdata-shm.mount: Deactivated successfully.
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.345 182729 INFO os_vif [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:ea:13,bridge_name='br-int',has_traffic_filtering=True,id=63af27f0-00f9-42b2-8238-747b706a37b9,network=Network(acc24254-349c-445c-aeed-f8eb3ec4b92e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63af27f0-00')
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.347 182729 INFO nova.virt.libvirt.driver [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Deleting instance files /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04_del
Jan 22 22:41:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-15a006ed379b590d07aff1ea2f9a57532054fea5d4dc903cf752e2e125d19173-merged.mount: Deactivated successfully.
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.348 182729 INFO nova.virt.libvirt.driver [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Deletion of /var/lib/nova/instances/74109741-c148-4605-9d06-89092c674c04_del complete
Jan 22 22:41:34 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:41:34 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:41:34 compute-0 podman[231673]: 2026-01-22 22:41:34.362248681 +0000 UTC m=+0.129822843 container cleanup 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:41:34 compute-0 systemd[1]: libpod-conmon-10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b.scope: Deactivated successfully.
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.426 182729 INFO nova.compute.manager [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.427 182729 DEBUG oslo.service.loopingcall [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.427 182729 DEBUG nova.compute.manager [-] [instance: 74109741-c148-4605-9d06-89092c674c04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.428 182729 DEBUG nova.network.neutron [-] [instance: 74109741-c148-4605-9d06-89092c674c04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:41:34 compute-0 podman[231719]: 2026-01-22 22:41:34.442171397 +0000 UTC m=+0.049016955 container remove 10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.448 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a51af2-cb73-454a-a74f-6e9ee1d70fc8]: (4, ('Thu Jan 22 10:41:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e (10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b)\n10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b\nThu Jan 22 10:41:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e (10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b)\n10178723118f0b2a960241c368d6b392e1a501d84006c5eaf528bff38c19387b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.450 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d8988596-b826-43bd-ba68-8c695d109d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.451 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacc24254-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:34 compute-0 kernel: tapacc24254-30: left promiscuous mode
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.453 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.469 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.472 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bec455cc-56ca-4384-8b38-bfe1983d99f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.489 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[68743e92-b8a1-4e57-8c97-2b3cf4979a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.491 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef21655-d508-4420-8432-cb5ba9328fa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.510 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3eb184-8ad3-4907-a4e8-d77677c0700d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530981, 'reachable_time': 41738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231734, 'error': None, 'target': 'ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 systemd[1]: run-netns-ovnmeta\x2dacc24254\x2d349c\x2d445c\x2daeed\x2df8eb3ec4b92e.mount: Deactivated successfully.
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.514 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-acc24254-349c-445c-aeed-f8eb3ec4b92e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:41:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:34.514 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dc8064-46cc-4f4b-82ef-7335e09efc12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:34 compute-0 nova_compute[182725]: 2026-01-22 22:41:34.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.080 182729 DEBUG nova.compute.manager [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-unplugged-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.080 182729 DEBUG oslo_concurrency.lockutils [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.080 182729 DEBUG oslo_concurrency.lockutils [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.081 182729 DEBUG oslo_concurrency.lockutils [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.081 182729 DEBUG nova.compute.manager [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] No waiting events found dispatching network-vif-unplugged-63af27f0-00f9-42b2-8238-747b706a37b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.081 182729 DEBUG nova.compute.manager [req-bc0f8a99-98a9-484b-a083-c854cd0c07ee req-23bc9bd6-381c-4cf9-9fa3-7d4e5991624a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-unplugged-63af27f0-00f9-42b2-8238-747b706a37b9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.516 182729 DEBUG nova.network.neutron [-] [instance: 74109741-c148-4605-9d06-89092c674c04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.538 182729 INFO nova.compute.manager [-] [instance: 74109741-c148-4605-9d06-89092c674c04] Took 1.11 seconds to deallocate network for instance.
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.601 182729 DEBUG nova.compute.manager [req-a6be6f54-b261-4c1b-a4d8-76a0c7a5c19e req-87ce4f80-1432-4483-95be-6e014ad5c97d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-deleted-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.632 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.633 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.715 182729 DEBUG nova.compute.provider_tree [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.750 182729 DEBUG nova.scheduler.client.report [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.795 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.852 182729 INFO nova.scheduler.client.report [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance 74109741-c148-4605-9d06-89092c674c04
Jan 22 22:41:35 compute-0 nova_compute[182725]: 2026-01-22 22:41:35.954 182729 DEBUG oslo_concurrency.lockutils [None req-533fc4ae-6259-4ab0-8960-7893af91f158 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:36 compute-0 nova_compute[182725]: 2026-01-22 22:41:36.248 182729 DEBUG nova.network.neutron [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updated VIF entry in instance network info cache for port 63af27f0-00f9-42b2-8238-747b706a37b9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:41:36 compute-0 nova_compute[182725]: 2026-01-22 22:41:36.249 182729 DEBUG nova.network.neutron [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Updating instance_info_cache with network_info: [{"id": "63af27f0-00f9-42b2-8238-747b706a37b9", "address": "fa:16:3e:89:ea:13", "network": {"id": "acc24254-349c-445c-aeed-f8eb3ec4b92e", "bridge": "br-int", "label": "tempest-network-smoke--415976130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63af27f0-00", "ovs_interfaceid": "63af27f0-00f9-42b2-8238-747b706a37b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:36 compute-0 nova_compute[182725]: 2026-01-22 22:41:36.273 182729 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-74109741-c148-4605-9d06-89092c674c04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:41:36 compute-0 nova_compute[182725]: 2026-01-22 22:41:36.485 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:36.486 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:41:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:36.487 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:41:36 compute-0 nova_compute[182725]: 2026-01-22 22:41:36.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.217 182729 DEBUG nova.compute.manager [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.217 182729 DEBUG oslo_concurrency.lockutils [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "74109741-c148-4605-9d06-89092c674c04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.218 182729 DEBUG oslo_concurrency.lockutils [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.218 182729 DEBUG oslo_concurrency.lockutils [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "74109741-c148-4605-9d06-89092c674c04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.218 182729 DEBUG nova.compute.manager [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] No waiting events found dispatching network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.219 182729 WARNING nova.compute.manager [req-0049db30-fc23-42a6-9943-c26e83136920 req-d3dd999d-37ac-4f13-bf65-07b2fdc32d3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 74109741-c148-4605-9d06-89092c674c04] Received unexpected event network-vif-plugged-63af27f0-00f9-42b2-8238-747b706a37b9 for instance with vm_state deleted and task_state None.
Jan 22 22:41:37 compute-0 nova_compute[182725]: 2026-01-22 22:41:37.873 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:39 compute-0 nova_compute[182725]: 2026-01-22 22:41:39.239 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:39 compute-0 nova_compute[182725]: 2026-01-22 22:41:39.337 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:40 compute-0 nova_compute[182725]: 2026-01-22 22:41:40.083 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:40 compute-0 nova_compute[182725]: 2026-01-22 22:41:40.271 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:41 compute-0 nova_compute[182725]: 2026-01-22 22:41:41.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:41:44 compute-0 nova_compute[182725]: 2026-01-22 22:41:44.241 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:44 compute-0 nova_compute[182725]: 2026-01-22 22:41:44.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:46 compute-0 podman[231738]: 2026-01-22 22:41:46.128583941 +0000 UTC m=+0.059348243 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:41:46 compute-0 podman[231736]: 2026-01-22 22:41:46.130478798 +0000 UTC m=+0.064325147 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:41:46 compute-0 podman[231737]: 2026-01-22 22:41:46.163360628 +0000 UTC m=+0.092388167 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 22:41:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:46.489 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:49 compute-0 nova_compute[182725]: 2026-01-22 22:41:49.242 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:49 compute-0 nova_compute[182725]: 2026-01-22 22:41:49.316 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121694.3152008, 74109741-c148-4605-9d06-89092c674c04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:49 compute-0 nova_compute[182725]: 2026-01-22 22:41:49.317 182729 INFO nova.compute.manager [-] [instance: 74109741-c148-4605-9d06-89092c674c04] VM Stopped (Lifecycle Event)
Jan 22 22:41:49 compute-0 nova_compute[182725]: 2026-01-22 22:41:49.334 182729 DEBUG nova.compute.manager [None req-c1c95561-5eeb-4fd7-a59a-9722e0cc6028 - - - - - -] [instance: 74109741-c148-4605-9d06-89092c674c04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:49 compute-0 nova_compute[182725]: 2026-01-22 22:41:49.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:53 compute-0 nova_compute[182725]: 2026-01-22 22:41:53.925 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:53 compute-0 nova_compute[182725]: 2026-01-22 22:41:53.926 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:53 compute-0 nova_compute[182725]: 2026-01-22 22:41:53.940 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.051 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.052 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.060 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.060 182729 INFO nova.compute.claims [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.198 182729 DEBUG nova.compute.provider_tree [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.213 182729 DEBUG nova.scheduler.client.report [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.237 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.239 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.245 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.293 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.293 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.314 182729 INFO nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.335 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.341 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.439 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.440 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.441 182729 INFO nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Creating image(s)
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.441 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.442 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.442 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.455 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.506 182729 DEBUG nova.policy [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.510 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.511 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.511 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.524 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.580 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.581 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.616 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.617 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.618 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.674 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.676 182729 DEBUG nova.virt.disk.api [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.677 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.739 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.741 182729 DEBUG nova.virt.disk.api [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.741 182729 DEBUG nova.objects.instance [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid 11abf2b9-1bc0-4393-b971-0ee745aa1e75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.756 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.756 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Ensure instance console log exists: /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.757 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.758 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:54 compute-0 nova_compute[182725]: 2026-01-22 22:41:54.758 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:55 compute-0 nova_compute[182725]: 2026-01-22 22:41:55.244 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Successfully created port: 452a5215-fc0f-4c85-bf69-268db34e744e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.111 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Successfully updated port: 452a5215-fc0f-4c85-bf69-268db34e744e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.129 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.129 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.130 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.311 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.915 182729 DEBUG nova.compute.manager [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.915 182729 DEBUG nova.compute.manager [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing instance network info cache due to event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:41:56 compute-0 nova_compute[182725]: 2026-01-22 22:41:56.916 182729 DEBUG oslo_concurrency.lockutils [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.103 182729 DEBUG nova.network.neutron [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.120 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.121 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Instance network_info: |[{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.121 182729 DEBUG oslo_concurrency.lockutils [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.121 182729 DEBUG nova.network.neutron [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.124 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Start _get_guest_xml network_info=[{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.129 182729 WARNING nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.135 182729 DEBUG nova.virt.libvirt.host [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.136 182729 DEBUG nova.virt.libvirt.host [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.145 182729 DEBUG nova.virt.libvirt.host [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.146 182729 DEBUG nova.virt.libvirt.host [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.147 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.147 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.148 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.148 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.148 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.149 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.149 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.149 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.150 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.150 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.150 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.150 182729 DEBUG nova.virt.hardware [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.155 182729 DEBUG nova.virt.libvirt.vif [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1369105219',display_name='tempest-TestNetworkBasicOps-server-1369105219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1369105219',id=138,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxvkzC9+TWiN5a2/sBgLprzCKD83Ww20/NvZIfxZvllzRwt6EzCq/7AQIXOMtNpn7QbLHYbNMDD7D0HxXG2533204Rxhwicpz/mT/IG8L6DsmSrpd3kkJgYuN5LW6KGMw==',key_name='tempest-TestNetworkBasicOps-815550840',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ku7wikjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:54Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=11abf2b9-1bc0-4393-b971-0ee745aa1e75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.156 182729 DEBUG nova.network.os_vif_util [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.157 182729 DEBUG nova.network.os_vif_util [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.158 182729 DEBUG nova.objects.instance [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11abf2b9-1bc0-4393-b971-0ee745aa1e75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.186 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <uuid>11abf2b9-1bc0-4393-b971-0ee745aa1e75</uuid>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <name>instance-0000008a</name>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-1369105219</nova:name>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:41:58</nova:creationTime>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         <nova:port uuid="452a5215-fc0f-4c85-bf69-268db34e744e">
Jan 22 22:41:58 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <system>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="serial">11abf2b9-1bc0-4393-b971-0ee745aa1e75</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="uuid">11abf2b9-1bc0-4393-b971-0ee745aa1e75</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </system>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <os>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </os>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <features>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </features>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.config"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:d3:b5:16"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <target dev="tap452a5215-fc"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/console.log" append="off"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <video>
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </video>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:41:58 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:41:58 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:41:58 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:41:58 compute-0 nova_compute[182725]: </domain>
Jan 22 22:41:58 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.188 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Preparing to wait for external event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.188 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.189 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.189 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.190 182729 DEBUG nova.virt.libvirt.vif [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1369105219',display_name='tempest-TestNetworkBasicOps-server-1369105219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1369105219',id=138,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxvkzC9+TWiN5a2/sBgLprzCKD83Ww20/NvZIfxZvllzRwt6EzCq/7AQIXOMtNpn7QbLHYbNMDD7D0HxXG2533204Rxhwicpz/mT/IG8L6DsmSrpd3kkJgYuN5LW6KGMw==',key_name='tempest-TestNetworkBasicOps-815550840',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ku7wikjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:54Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=11abf2b9-1bc0-4393-b971-0ee745aa1e75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.190 182729 DEBUG nova.network.os_vif_util [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.191 182729 DEBUG nova.network.os_vif_util [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.191 182729 DEBUG os_vif [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.191 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.192 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.192 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.196 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.196 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap452a5215-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.197 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap452a5215-fc, col_values=(('external_ids', {'iface-id': '452a5215-fc0f-4c85-bf69-268db34e744e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:b5:16', 'vm-uuid': '11abf2b9-1bc0-4393-b971-0ee745aa1e75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.199 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 NetworkManager[54954]: <info>  [1769121718.2003] manager: (tap452a5215-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.201 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.207 182729 INFO os_vif [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc')
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.283 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.283 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.284 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:d3:b5:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.284 182729 INFO nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Using config drive
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.712 182729 INFO nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Creating config drive at /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.config
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.718 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_7ea3bv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.843 182729 DEBUG oslo_concurrency.processutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_7ea3bv" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:41:58 compute-0 kernel: tap452a5215-fc: entered promiscuous mode
Jan 22 22:41:58 compute-0 NetworkManager[54954]: <info>  [1769121718.9029] manager: (tap452a5215-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.903 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 ovn_controller[94850]: 2026-01-22T22:41:58Z|00554|binding|INFO|Claiming lport 452a5215-fc0f-4c85-bf69-268db34e744e for this chassis.
Jan 22 22:41:58 compute-0 ovn_controller[94850]: 2026-01-22T22:41:58Z|00555|binding|INFO|452a5215-fc0f-4c85-bf69-268db34e744e: Claiming fa:16:3e:d3:b5:16 10.100.0.12
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.906 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.919 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:b5:16 10.100.0.12'], port_security=['fa:16:3e:d3:b5:16 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '81c9bc76-4ce6-41d9-8955-8e38f4f633b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ebf5952-91d3-4d6e-a145-1401e7d14d3f, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=452a5215-fc0f-4c85-bf69-268db34e744e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.920 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 452a5215-fc0f-4c85-bf69-268db34e744e in datapath fd739554-520e-4e70-9045-bd1e5e1f0fe0 bound to our chassis
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.922 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd739554-520e-4e70-9045-bd1e5e1f0fe0
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.933 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf06c03-80d1-4578-bd20-34e62a9d4e96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.934 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd739554-51 in ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.935 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd739554-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.936 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[05b198fa-b212-4678-88e3-34d326d1aadb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.936 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a472af-b366-4fc3-8e34-7ddce9913c04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:58 compute-0 systemd-udevd[231838]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:41:58 compute-0 systemd-machined[154006]: New machine qemu-63-instance-0000008a.
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.948 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e06f9a9d-8c2c-43a4-a834-f9fc5e2559d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:58 compute-0 NetworkManager[54954]: <info>  [1769121718.9557] device (tap452a5215-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:41:58 compute-0 NetworkManager[54954]: <info>  [1769121718.9569] device (tap452a5215-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000008a.
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.963 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 ovn_controller[94850]: 2026-01-22T22:41:58Z|00556|binding|INFO|Setting lport 452a5215-fc0f-4c85-bf69-268db34e744e ovn-installed in OVS
Jan 22 22:41:58 compute-0 ovn_controller[94850]: 2026-01-22T22:41:58Z|00557|binding|INFO|Setting lport 452a5215-fc0f-4c85-bf69-268db34e744e up in Southbound
Jan 22 22:41:58 compute-0 nova_compute[182725]: 2026-01-22 22:41:58.969 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:58.973 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6f1c81-7b0b-4f5d-a06e-903c0fdf99e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.005 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[faec5ac9-3190-48e5-aa7a-b61257d98683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.010 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[49dd6f5c-1c0f-4c04-be89-7433a7e3eeb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 NetworkManager[54954]: <info>  [1769121719.0117] manager: (tapfd739554-50): new Veth device (/org/freedesktop/NetworkManager/Devices/257)
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.040 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[313284bc-af7d-42da-ae6b-d5e67c906f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.042 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc53f22-9dab-4aff-a6ee-862c00276935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 NetworkManager[54954]: <info>  [1769121719.0629] device (tapfd739554-50): carrier: link connected
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.070 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[1a172ebb-b56b-4b7e-a274-dac01b981a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.085 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[763daae0-c417-427e-8a81-c85900ee3632]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd739554-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535666, 'reachable_time': 42346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231871, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.097 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a351e725-1595-4d27-af80-39ee3209e927]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:e3f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535666, 'tstamp': 535666}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231872, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.114 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[060f81a7-da85-48b8-8902-a69847ce3753]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd739554-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535666, 'reachable_time': 42346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231873, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.143 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[caa1d274-305b-4086-a89d-41cc3e352f4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.196 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[214f7675-5ea6-4d9b-abc5-3dcb61fdc1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.198 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd739554-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.198 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.199 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd739554-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:59 compute-0 kernel: tapfd739554-50: entered promiscuous mode
Jan 22 22:41:59 compute-0 NetworkManager[54954]: <info>  [1769121719.2013] manager: (tapfd739554-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.206 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd739554-50, col_values=(('external_ids', {'iface-id': '545ef9e3-da19-466a-bbee-43bcf179d362'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.207 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:59 compute-0 ovn_controller[94850]: 2026-01-22T22:41:59Z|00558|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.207 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.210 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.211 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b95ff67b-1e54-4e41-a23c-d82d506784cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.211 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-fd739554-520e-4e70-9045-bd1e5e1f0fe0
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID fd739554-520e-4e70-9045-bd1e5e1f0fe0
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:41:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:41:59.213 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'env', 'PROCESS_TAG=haproxy-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd739554-520e-4e70-9045-bd1e5e1f0fe0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.219 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.247 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:41:59 compute-0 podman[231909]: 2026-01-22 22:41:59.597823048 +0000 UTC m=+0.054299787 container create 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.607 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121719.6066046, 11abf2b9-1bc0-4393-b971-0ee745aa1e75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.608 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] VM Started (Lifecycle Event)
Jan 22 22:41:59 compute-0 systemd[1]: Started libpod-conmon-5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27.scope.
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.639 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.643 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121719.6080256, 11abf2b9-1bc0-4393-b971-0ee745aa1e75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.643 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] VM Paused (Lifecycle Event)
Jan 22 22:41:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:41:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8194613fe4bd533b801d6be8e99fed8bf214494fba33763c0c84a87f304b603/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.662 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.664 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:41:59 compute-0 podman[231909]: 2026-01-22 22:41:59.569488321 +0000 UTC m=+0.025965080 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:41:59 compute-0 podman[231909]: 2026-01-22 22:41:59.668036742 +0000 UTC m=+0.124513511 container init 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:41:59 compute-0 podman[231909]: 2026-01-22 22:41:59.675892188 +0000 UTC m=+0.132368927 container start 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:41:59 compute-0 podman[231924]: 2026-01-22 22:41:59.688583675 +0000 UTC m=+0.054596824 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.692 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:41:59 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [NOTICE]   (231946) : New worker (231949) forked
Jan 22 22:41:59 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [NOTICE]   (231946) : Loading success.
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.833 182729 DEBUG nova.compute.manager [req-afa73ffb-7277-4ad4-a30a-0d6d9bb000f2 req-58edb941-eec2-4c0f-ac6d-15cad46c7d38 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.834 182729 DEBUG oslo_concurrency.lockutils [req-afa73ffb-7277-4ad4-a30a-0d6d9bb000f2 req-58edb941-eec2-4c0f-ac6d-15cad46c7d38 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.834 182729 DEBUG oslo_concurrency.lockutils [req-afa73ffb-7277-4ad4-a30a-0d6d9bb000f2 req-58edb941-eec2-4c0f-ac6d-15cad46c7d38 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.834 182729 DEBUG oslo_concurrency.lockutils [req-afa73ffb-7277-4ad4-a30a-0d6d9bb000f2 req-58edb941-eec2-4c0f-ac6d-15cad46c7d38 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.834 182729 DEBUG nova.compute.manager [req-afa73ffb-7277-4ad4-a30a-0d6d9bb000f2 req-58edb941-eec2-4c0f-ac6d-15cad46c7d38 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Processing event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.835 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.839 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121719.839351, 11abf2b9-1bc0-4393-b971-0ee745aa1e75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.839 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] VM Resumed (Lifecycle Event)
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.841 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.843 182729 INFO nova.virt.libvirt.driver [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Instance spawned successfully.
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.844 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.860 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.865 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.868 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.869 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.869 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.870 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.870 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.870 182729 DEBUG nova.virt.libvirt.driver [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:41:59 compute-0 nova_compute[182725]: 2026-01-22 22:41:59.898 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.018 182729 INFO nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Took 5.58 seconds to spawn the instance on the hypervisor.
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.019 182729 DEBUG nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.097 182729 INFO nova.compute.manager [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Took 6.09 seconds to build instance.
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.111 182729 DEBUG oslo_concurrency.lockutils [None req-3c623b07-9381-4b6d-a9f8-9e2d1ac184ab b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.745 182729 DEBUG nova.network.neutron [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updated VIF entry in instance network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.746 182729 DEBUG nova.network.neutron [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:42:00 compute-0 nova_compute[182725]: 2026-01-22 22:42:00.762 182729 DEBUG oslo_concurrency.lockutils [req-843c923a-19fe-40e4-b739-38492ddc8373 req-0adcc83c-6d0f-4aa4-ae8a-8373a58b642f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.930 182729 DEBUG nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.931 182729 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.931 182729 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.932 182729 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.932 182729 DEBUG nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] No waiting events found dispatching network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:42:01 compute-0 nova_compute[182725]: 2026-01-22 22:42:01.932 182729 WARNING nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received unexpected event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e for instance with vm_state active and task_state None.
Jan 22 22:42:03 compute-0 nova_compute[182725]: 2026-01-22 22:42:03.202 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:04 compute-0 nova_compute[182725]: 2026-01-22 22:42:04.249 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:04 compute-0 nova_compute[182725]: 2026-01-22 22:42:04.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:04 compute-0 NetworkManager[54954]: <info>  [1769121724.6213] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 22 22:42:04 compute-0 NetworkManager[54954]: <info>  [1769121724.6235] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Jan 22 22:42:04 compute-0 nova_compute[182725]: 2026-01-22 22:42:04.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:04 compute-0 ovn_controller[94850]: 2026-01-22T22:42:04Z|00559|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:04 compute-0 nova_compute[182725]: 2026-01-22 22:42:04.779 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:05 compute-0 podman[231960]: 2026-01-22 22:42:05.16431846 +0000 UTC m=+0.081745562 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Jan 22 22:42:05 compute-0 podman[231959]: 2026-01-22 22:42:05.19393859 +0000 UTC m=+0.115467475 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:42:05 compute-0 nova_compute[182725]: 2026-01-22 22:42:05.265 182729 DEBUG nova.compute.manager [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:05 compute-0 nova_compute[182725]: 2026-01-22 22:42:05.265 182729 DEBUG nova.compute.manager [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing instance network info cache due to event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:42:05 compute-0 nova_compute[182725]: 2026-01-22 22:42:05.265 182729 DEBUG oslo_concurrency.lockutils [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:42:05 compute-0 nova_compute[182725]: 2026-01-22 22:42:05.266 182729 DEBUG oslo_concurrency.lockutils [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:42:05 compute-0 nova_compute[182725]: 2026-01-22 22:42:05.266 182729 DEBUG nova.network.neutron [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:42:06 compute-0 nova_compute[182725]: 2026-01-22 22:42:06.708 182729 DEBUG nova.network.neutron [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updated VIF entry in instance network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:42:06 compute-0 nova_compute[182725]: 2026-01-22 22:42:06.709 182729 DEBUG nova.network.neutron [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:42:06 compute-0 nova_compute[182725]: 2026-01-22 22:42:06.746 182729 DEBUG oslo_concurrency.lockutils [req-8caa58aa-e5c8-4e0e-98d4-2b5f1d8a5b76 req-54c7a348-b6d0-42cc-8f9a-1d083e350a96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:42:08 compute-0 nova_compute[182725]: 2026-01-22 22:42:08.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.113 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'name': 'tempest-TestNetworkBasicOps-server-1369105219', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffd58948cb444c25ae034a02c0344de7', 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'hostId': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.117 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 11abf2b9-1bc0-4393-b971-0ee745aa1e75 / tap452a5215-fc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.118 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b47ca5c-a017-42cf-ba24-92b1e14d9e99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.114431', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95ca48c0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': 'bb5e7ef3f4a75e726075e22a3a8cffb6f26338a4a173176d4207b8be108775ef'}]}, 'timestamp': '2026-01-22 22:42:09.118633', '_unique_id': '3fa301fd42ac4e29b447c054da547352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.119 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.120 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2366e08-4976-4e3f-b1a5-b25394ee0b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.120967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95cab292-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': '87015ddc53a2b77274bb6cbb37183aeaa95e486ca01dbd5f37a9a2c6ebb0710d'}]}, 'timestamp': '2026-01-22 22:42:09.121298', '_unique_id': 'e817600d7aff48fe87f8758185045ac0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.138 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.139 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e2fa083-98a9-4eb6-9d8d-fd91d2a68994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.123177', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95cd6690-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': '9ef0841271d0f94b7aba2ce1ed7cc8794181aae716dd8fe9fb644c94bc9a579d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.123177', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95cd7d92-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': '8be966df8fba942d788989eac64aa6c3ad432412e056965affcff96e92065578'}]}, 'timestamp': '2026-01-22 22:42:09.139686', '_unique_id': 'd776b26fb16d4c019c23166759fa1d51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.142 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.143 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>]
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.143 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31fc46e3-7d91-41c4-8a87-554a708adc2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.143910', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95ce37a0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': 'c890e0b76ff2a7195809ec6061b93e216462faf41e38122211ee9af5ad78225b'}]}, 'timestamp': '2026-01-22 22:42:09.144530', '_unique_id': '3daaede97c674b19b02fa491f2761c11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.145 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.147 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.169 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/cpu volume: 9030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9ba2053-848e-4605-89bf-d587df661ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9030000000, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'timestamp': '2026-01-22T22:42:09.147243', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '95d22fe0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.826912696, 'message_signature': '5fa53ba5d46acbe1cb6aeccfcc7e9cfcd2c9fcb684b74c78ff25c162ecf01bd1'}]}, 'timestamp': '2026-01-22 22:42:09.170452', '_unique_id': '9a978ae8db1d49cbbdc65c5d2c955190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.171 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.172 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.172 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.172 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 11abf2b9-1bc0-4393-b971-0ee745aa1e75: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.199 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.200 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d14bd21-ce2e-4b92-bf03-1d25bc551ce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.172875', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d6bdf8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': 'e67bd6aecd1980057c79ba1b66397a23be7f0c6716d937011faa5ff8f19326cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.172875', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d6ccd0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '61c4099922831330ac316c4c298328263827900bd9847da3a065e761643ab59e'}]}, 'timestamp': '2026-01-22 22:42:09.200616', '_unique_id': '102f9a95dd8a40b483b3c6625cd381bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.201 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.202 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.203 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a79f6769-756c-43e8-9bf6-ca71d99ea41a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.202891', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d73490-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': '530d63488e23f075bff20c40711151b980c4a342e1f9550b165e9f689f110a10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.202891', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d7416a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': 'cd9b3ea596cf76d4705b14ac96cb5af40263977af6ed65d6c8531b9ddc7e0292'}]}, 'timestamp': '2026-01-22 22:42:09.203629', '_unique_id': 'fbc8225772f643269ce03fd1731272a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.204 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>]
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>]
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.206 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73dccb88-14af-4787-8de5-c717542f31ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.206253', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95d7b640-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': '5271481f3c9f056ca09d102e9cd3880e0c37d860aca936d17b21a90d2684d40a'}]}, 'timestamp': '2026-01-22 22:42:09.206594', '_unique_id': '69c6db572cf341d2a9b4abcae29d87e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.208 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3db47d66-afb7-40fe-be5d-892706546372', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.208220', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95d802c6-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': '3a6c9d07112b61db84efa979e63accb12ea2d3cbf10617ef83b54a4b70fc8eff'}]}, 'timestamp': '2026-01-22 22:42:09.208552', '_unique_id': 'f531bd34517b4fdd8df8ae433f0bba3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.210 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.210 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67201a06-50d0-4dd2-be7c-3554f7cae3d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.210218', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d850be-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': '74d3dd28612119e5cd59a1756565fbc19b45d9a4c888062a485cff2c340394eb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.210218', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d85c44-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.780460266, 'message_signature': 'f53104fb936931606d2daf48128d748284d1ff9fb78f0dca8a1684580c661ce1'}]}, 'timestamp': '2026-01-22 22:42:09.210847', '_unique_id': '4d2c1577559d4ab0a8ec7e0082c2735a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.212 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.latency volume: 108206062 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.212 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.latency volume: 658827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fd37cf2-2d2f-4b75-aacf-b6c526dec3e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 108206062, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.212506', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d8aa46-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '29aa134d532fdea31c813cca829e46fc4f5673d10cb3a648b2d3876a9ead0995'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 658827, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.212506', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d8b81a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': 'c93efd5b1499c3a81de8f52ca3dddf827b0661a45ee0604d245588b8627f1c87'}]}, 'timestamp': '2026-01-22 22:42:09.213183', '_unique_id': '28908ab1746e4eb7ab9fbc1bce0dc478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.213 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.214 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.215 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5001f14-86b1-4ee7-ae5a-1107b01a72e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.214883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d908d8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '360eac8e3e4ff7589ebd880c0bddb8a9b018ef5c60efe17f5bd0e10ce24cb892'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.214883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d916ca-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '00cefdf585ecaf1feb7d79cc9f65ba97f8f0419340486a4271743a05ba1fae29'}]}, 'timestamp': '2026-01-22 22:42:09.215605', '_unique_id': 'a9c54660c31249c7b608e5aa7765115b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.217 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5691b0b-118b-41db-89be-f0303843e454', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.217442', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95d96b16-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': 'd32858fb30497266208b0bd528ff4e230b969282dc93494faeb35970c0bce41a'}]}, 'timestamp': '2026-01-22 22:42:09.217857', '_unique_id': 'cfcef6302048418e894a0514a7838f1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.218 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.219 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.219 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '608d91da-c5f5-4d80-932b-6ba4f5c22a6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.219590', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95d9bf8a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '4b7268d9e260ec7399b6067fd0c47d64a38f427ed898628dbc7820a9439944e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.219590', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95d9cd4a-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '388e68413df6e5e3efc97b8fb3f118032f6a478e9c350ebe8798aa8b110079d0'}]}, 'timestamp': '2026-01-22 22:42:09.220294', '_unique_id': '98686da73eb14a768dd5ad9f7d477834'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.222 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.222 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdd61d3f-5b65-479d-ae73-d3242ef877f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.222061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95da1f70-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': 'e84f91437059423b49e28cf69bf4d5385a846e876a08f85fcf8e86d88d2b0971'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.222061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95da2f06-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '414c963bda4f57ad768f3419cb9eef80d2cbce956e9323dee63b136e75426f52'}]}, 'timestamp': '2026-01-22 22:42:09.222834', '_unique_id': '7f55fdc519b84b1c8534d65736585274'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.223 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.224 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6665152c-a883-457c-991a-53d7314dd399', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.224488', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95da7e66-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': 'e39b66db8fd51e7215008281fa05adf7468dcde8ce7a87f14bbb6e7f2bfa227c'}]}, 'timestamp': '2026-01-22 22:42:09.224852', '_unique_id': 'fc88094130ba412da6a3bf06b698974f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.225 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.226 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e58340e-6f6d-4d79-9949-de32476e5ad9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.226607', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95dad384-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': '5ae3a161dff554d71c568f342b66c218bdce1d6f4b7e8bf2d7ae018392eee18b'}]}, 'timestamp': '2026-01-22 22:42:09.227006', '_unique_id': '2171e1762bb7490b996212729ac0c98f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.227 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.228 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.228 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1369105219>]
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.229 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6b197cd-5d4f-4f09-9b0a-e99458eb56e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.229212', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95db36d0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': '138d600569a1d3cc4bc2895e3e7c4bc927087c5a4f3456e88719fe4b3f4fb48b'}]}, 'timestamp': '2026-01-22 22:42:09.229543', '_unique_id': '880f32551f4547d894ed97a51f5a73a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.230 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.231 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d4c9f20-dadc-4c97-87a6-0459aa78dd18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-0000008a-11abf2b9-1bc0-4393-b971-0ee745aa1e75-tap452a5215-fc', 'timestamp': '2026-01-22T22:42:09.231248', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'tap452a5215-fc', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:b5:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap452a5215-fc'}, 'message_id': '95db8752-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.771698847, 'message_signature': 'ecc7520dec2b061846b789cffc95649879e53e645eec77bdd3704f707193e224'}]}, 'timestamp': '2026-01-22 22:42:09.231607', '_unique_id': 'f229d000e9d04c1faf914fd70cf7b4c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.232 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.233 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.233 12 DEBUG ceilometer.compute.pollsters [-] 11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68a9eab2-a9e2-4020-87f8-ec9528aa9a8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-vda', 'timestamp': '2026-01-22T22:42:09.233421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95dbdb30-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': '8513964699c29a3996c62bac25c001a139dff04946b0ca2852625ba12d10353c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75-sda', 'timestamp': '2026-01-22T22:42:09.233421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1369105219', 'name': 'instance-0000008a', 'instance_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'instance_type': 'm1.nano', 'host': '39f4816d83643b2776f7b01d7682b058af908ab4ba71066c17f4cdd4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95dbea26-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5366.830141496, 'message_signature': 'b23ac6e3fffd4034aa22f9842afe81f467ff9a9b598939f5535c386eabf5119b'}]}, 'timestamp': '2026-01-22 22:42:09.234116', '_unique_id': 'de38b8c0160f4798ae969a4ca0000030'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:42:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:42:09.234 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:42:09 compute-0 nova_compute[182725]: 2026-01-22 22:42:09.250 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:12 compute-0 ovn_controller[94850]: 2026-01-22T22:42:12Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:b5:16 10.100.0.12
Jan 22 22:42:12 compute-0 ovn_controller[94850]: 2026-01-22T22:42:12Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:b5:16 10.100.0.12
Jan 22 22:42:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:12.450 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:12.451 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:12.452 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:13 compute-0 ovn_controller[94850]: 2026-01-22T22:42:13Z|00560|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:13 compute-0 nova_compute[182725]: 2026-01-22 22:42:13.210 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:13 compute-0 nova_compute[182725]: 2026-01-22 22:42:13.229 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:14 compute-0 nova_compute[182725]: 2026-01-22 22:42:14.252 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:17 compute-0 podman[232025]: 2026-01-22 22:42:17.114451193 +0000 UTC m=+0.047905178 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:42:17 compute-0 podman[232027]: 2026-01-22 22:42:17.130480233 +0000 UTC m=+0.057060626 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:42:17 compute-0 podman[232026]: 2026-01-22 22:42:17.143539129 +0000 UTC m=+0.067928057 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:42:17 compute-0 nova_compute[182725]: 2026-01-22 22:42:17.913 182729 INFO nova.compute.manager [None req-de43a78e-96ac-431f-a4c4-86fd855ae3cb b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Get console output
Jan 22 22:42:17 compute-0 nova_compute[182725]: 2026-01-22 22:42:17.920 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:42:18 compute-0 nova_compute[182725]: 2026-01-22 22:42:18.214 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:19 compute-0 nova_compute[182725]: 2026-01-22 22:42:19.255 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.103 182729 DEBUG nova.compute.manager [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.104 182729 DEBUG nova.compute.manager [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing instance network info cache due to event network-changed-452a5215-fc0f-4c85-bf69-268db34e744e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.105 182729 DEBUG oslo_concurrency.lockutils [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.105 182729 DEBUG oslo_concurrency.lockutils [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.105 182729 DEBUG nova.network.neutron [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Refreshing network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:42:20 compute-0 nova_compute[182725]: 2026-01-22 22:42:20.900 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:23 compute-0 nova_compute[182725]: 2026-01-22 22:42:23.217 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:23 compute-0 nova_compute[182725]: 2026-01-22 22:42:23.691 182729 DEBUG nova.network.neutron [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updated VIF entry in instance network info cache for port 452a5215-fc0f-4c85-bf69-268db34e744e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:42:23 compute-0 nova_compute[182725]: 2026-01-22 22:42:23.691 182729 DEBUG nova.network.neutron [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:42:23 compute-0 nova_compute[182725]: 2026-01-22 22:42:23.721 182729 DEBUG oslo_concurrency.lockutils [req-22a54ca4-0305-42dd-8e42-10f15dfa9ec6 req-9a3635cd-2c2c-4e47-b860-4a85c9e8d59c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:42:24 compute-0 nova_compute[182725]: 2026-01-22 22:42:24.256 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:24 compute-0 ovn_controller[94850]: 2026-01-22T22:42:24Z|00561|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:24 compute-0 nova_compute[182725]: 2026-01-22 22:42:24.918 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:25 compute-0 ovn_controller[94850]: 2026-01-22T22:42:25Z|00562|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:25 compute-0 nova_compute[182725]: 2026-01-22 22:42:25.028 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:27 compute-0 nova_compute[182725]: 2026-01-22 22:42:27.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:27 compute-0 nova_compute[182725]: 2026-01-22 22:42:27.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:42:27 compute-0 nova_compute[182725]: 2026-01-22 22:42:27.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:42:28 compute-0 nova_compute[182725]: 2026-01-22 22:42:28.223 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:28 compute-0 nova_compute[182725]: 2026-01-22 22:42:28.230 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:42:28 compute-0 nova_compute[182725]: 2026-01-22 22:42:28.231 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:42:28 compute-0 nova_compute[182725]: 2026-01-22 22:42:28.231 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:42:28 compute-0 nova_compute[182725]: 2026-01-22 22:42:28.232 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11abf2b9-1bc0-4393-b971-0ee745aa1e75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.258 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.953 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [{"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.975 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-11abf2b9-1bc0-4393-b971-0ee745aa1e75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.976 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.977 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:29 compute-0 nova_compute[182725]: 2026-01-22 22:42:29.978 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.008 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.008 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.009 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.009 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.113 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:42:30 compute-0 podman[232088]: 2026-01-22 22:42:30.131454585 +0000 UTC m=+0.073071936 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.181 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.183 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.252 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.412 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.413 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5517MB free_disk=73.30400848388672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.413 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.413 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.485 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 11abf2b9-1bc0-4393-b971-0ee745aa1e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.486 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.486 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.526 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.549 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.574 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:42:30 compute-0 nova_compute[182725]: 2026-01-22 22:42:30.574 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:32 compute-0 nova_compute[182725]: 2026-01-22 22:42:32.486 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:32 compute-0 nova_compute[182725]: 2026-01-22 22:42:32.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:33 compute-0 nova_compute[182725]: 2026-01-22 22:42:33.225 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:34 compute-0 nova_compute[182725]: 2026-01-22 22:42:34.260 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:34 compute-0 nova_compute[182725]: 2026-01-22 22:42:34.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:34 compute-0 nova_compute[182725]: 2026-01-22 22:42:34.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:34 compute-0 nova_compute[182725]: 2026-01-22 22:42:34.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:42:36 compute-0 podman[232116]: 2026-01-22 22:42:36.124750907 +0000 UTC m=+0.060242575 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Jan 22 22:42:36 compute-0 podman[232115]: 2026-01-22 22:42:36.145590358 +0000 UTC m=+0.084631855 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:42:37 compute-0 nova_compute[182725]: 2026-01-22 22:42:37.616 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:37 compute-0 NetworkManager[54954]: <info>  [1769121757.6170] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 22 22:42:37 compute-0 NetworkManager[54954]: <info>  [1769121757.6179] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 22 22:42:37 compute-0 nova_compute[182725]: 2026-01-22 22:42:37.752 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:37 compute-0 ovn_controller[94850]: 2026-01-22T22:42:37Z|00563|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:37 compute-0 nova_compute[182725]: 2026-01-22 22:42:37.764 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:38 compute-0 nova_compute[182725]: 2026-01-22 22:42:38.227 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:38 compute-0 nova_compute[182725]: 2026-01-22 22:42:38.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:42:39 compute-0 nova_compute[182725]: 2026-01-22 22:42:39.262 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:39 compute-0 nova_compute[182725]: 2026-01-22 22:42:39.893 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:40.661 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:42:40 compute-0 nova_compute[182725]: 2026-01-22 22:42:40.662 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:40.662 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:42:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:41.665 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:42:41 compute-0 nova_compute[182725]: 2026-01-22 22:42:41.751 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:43.152 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:be:97 10.100.0.2 2001:db8::f816:3eff:fe2c:be97'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:be97/64', 'neutron:device_id': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b64fd7f9-9daa-4dd2-9dfa-7c863399e516, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4c584d5e-ac75-444a-b20c-05a59b075ca2) old=Port_Binding(mac=['fa:16:3e:2c:be:97 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:42:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:43.154 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4c584d5e-ac75-444a-b20c-05a59b075ca2 in datapath 3676296d-a568-47ea-b6cb-2ef8aff27f14 updated
Jan 22 22:42:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:43.155 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3676296d-a568-47ea-b6cb-2ef8aff27f14, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:42:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:43.156 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b754d3cd-bd9c-4523-a583-88f68105fd99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:43 compute-0 nova_compute[182725]: 2026-01-22 22:42:43.231 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:44 compute-0 nova_compute[182725]: 2026-01-22 22:42:44.263 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:48 compute-0 podman[232161]: 2026-01-22 22:42:48.112543537 +0000 UTC m=+0.051615480 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:42:48 compute-0 podman[232163]: 2026-01-22 22:42:48.115803118 +0000 UTC m=+0.048180884 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:42:48 compute-0 podman[232162]: 2026-01-22 22:42:48.146659359 +0000 UTC m=+0.081917317 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 22:42:48 compute-0 nova_compute[182725]: 2026-01-22 22:42:48.234 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:49 compute-0 nova_compute[182725]: 2026-01-22 22:42:49.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:53 compute-0 nova_compute[182725]: 2026-01-22 22:42:53.238 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:54 compute-0 nova_compute[182725]: 2026-01-22 22:42:54.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:54 compute-0 ovn_controller[94850]: 2026-01-22T22:42:54Z|00564|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:54 compute-0 nova_compute[182725]: 2026-01-22 22:42:54.484 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:54 compute-0 ovn_controller[94850]: 2026-01-22T22:42:54Z|00565|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:54 compute-0 nova_compute[182725]: 2026-01-22 22:42:54.630 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.034 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.035 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.035 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.035 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.036 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.048 182729 INFO nova.compute.manager [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Terminating instance
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.060 182729 DEBUG nova.compute.manager [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:42:56 compute-0 kernel: tap452a5215-fc (unregistering): left promiscuous mode
Jan 22 22:42:56 compute-0 NetworkManager[54954]: <info>  [1769121776.0830] device (tap452a5215-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.089 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 ovn_controller[94850]: 2026-01-22T22:42:56Z|00566|binding|INFO|Releasing lport 452a5215-fc0f-4c85-bf69-268db34e744e from this chassis (sb_readonly=0)
Jan 22 22:42:56 compute-0 ovn_controller[94850]: 2026-01-22T22:42:56Z|00567|binding|INFO|Setting lport 452a5215-fc0f-4c85-bf69-268db34e744e down in Southbound
Jan 22 22:42:56 compute-0 ovn_controller[94850]: 2026-01-22T22:42:56Z|00568|binding|INFO|Removing iface tap452a5215-fc ovn-installed in OVS
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.092 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 ovn_controller[94850]: 2026-01-22T22:42:56Z|00569|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.099 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:b5:16 10.100.0.12'], port_security=['fa:16:3e:d3:b5:16 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '11abf2b9-1bc0-4393-b971-0ee745aa1e75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '81c9bc76-4ce6-41d9-8955-8e38f4f633b8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ebf5952-91d3-4d6e-a145-1401e7d14d3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=452a5215-fc0f-4c85-bf69-268db34e744e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.100 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 452a5215-fc0f-4c85-bf69-268db34e744e in datapath fd739554-520e-4e70-9045-bd1e5e1f0fe0 unbound from our chassis
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.102 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd739554-520e-4e70-9045-bd1e5e1f0fe0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.102 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a0656c83-760c-4356-b6a7-1deea3b505e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.103 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 namespace which is not needed anymore
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.108 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Jan 22 22:42:56 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008a.scope: Consumed 14.608s CPU time.
Jan 22 22:42:56 compute-0 systemd-machined[154006]: Machine qemu-63-instance-0000008a terminated.
Jan 22 22:42:56 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [NOTICE]   (231946) : haproxy version is 2.8.14-c23fe91
Jan 22 22:42:56 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [NOTICE]   (231946) : path to executable is /usr/sbin/haproxy
Jan 22 22:42:56 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [WARNING]  (231946) : Exiting Master process...
Jan 22 22:42:56 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [ALERT]    (231946) : Current worker (231949) exited with code 143 (Terminated)
Jan 22 22:42:56 compute-0 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[231927]: [WARNING]  (231946) : All workers exited. Exiting... (0)
Jan 22 22:42:56 compute-0 systemd[1]: libpod-5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27.scope: Deactivated successfully.
Jan 22 22:42:56 compute-0 podman[232250]: 2026-01-22 22:42:56.224006805 +0000 UTC m=+0.045542628 container died 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:42:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27-userdata-shm.mount: Deactivated successfully.
Jan 22 22:42:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-a8194613fe4bd533b801d6be8e99fed8bf214494fba33763c0c84a87f304b603-merged.mount: Deactivated successfully.
Jan 22 22:42:56 compute-0 podman[232250]: 2026-01-22 22:42:56.260486476 +0000 UTC m=+0.082022289 container cleanup 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:42:56 compute-0 systemd[1]: libpod-conmon-5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27.scope: Deactivated successfully.
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.318 182729 INFO nova.virt.libvirt.driver [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Instance destroyed successfully.
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.319 182729 DEBUG nova.objects.instance [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid 11abf2b9-1bc0-4393-b971-0ee745aa1e75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:42:56 compute-0 podman[232279]: 2026-01-22 22:42:56.322168907 +0000 UTC m=+0.041078647 container remove 5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.327 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c630cc-ed7e-43bb-8049-5b429678dfb4]: (4, ('Thu Jan 22 10:42:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 (5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27)\n5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27\nThu Jan 22 10:42:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 (5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27)\n5f543dda6f17669124b009d0fd85428b5c02197781dcdc70f8ddc0cc0aef9a27\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.329 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d9159e17-2bbc-4224-894d-4cf67b752404]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.330 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd739554-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.331 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 kernel: tapfd739554-50: left promiscuous mode
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.334 182729 DEBUG nova.virt.libvirt.vif [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1369105219',display_name='tempest-TestNetworkBasicOps-server-1369105219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1369105219',id=138,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxvkzC9+TWiN5a2/sBgLprzCKD83Ww20/NvZIfxZvllzRwt6EzCq/7AQIXOMtNpn7QbLHYbNMDD7D0HxXG2533204Rxhwicpz/mT/IG8L6DsmSrpd3kkJgYuN5LW6KGMw==',key_name='tempest-TestNetworkBasicOps-815550840',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:42:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ku7wikjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:00Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=11abf2b9-1bc0-4393-b971-0ee745aa1e75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.334 182729 DEBUG nova.network.os_vif_util [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "452a5215-fc0f-4c85-bf69-268db34e744e", "address": "fa:16:3e:d3:b5:16", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap452a5215-fc", "ovs_interfaceid": "452a5215-fc0f-4c85-bf69-268db34e744e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.335 182729 DEBUG nova.network.os_vif_util [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.336 182729 DEBUG os_vif [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.337 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.337 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap452a5215-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.338 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.346 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.348 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9e9bb9-73a3-4766-9d0d-125eeb4a2b92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.349 182729 INFO os_vif [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:b5:16,bridge_name='br-int',has_traffic_filtering=True,id=452a5215-fc0f-4c85-bf69-268db34e744e,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap452a5215-fc')
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.350 182729 INFO nova.virt.libvirt.driver [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Deleting instance files /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75_del
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.350 182729 INFO nova.virt.libvirt.driver [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Deletion of /var/lib/nova/instances/11abf2b9-1bc0-4393-b971-0ee745aa1e75_del complete
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.364 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5830a0-5c60-4ec7-9dc0-2e5f2c1b7726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.365 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[629e4cd5-dc5d-4bb8-b69f-262ada5bff92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.380 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[63956d8d-3100-4458-94e4-bee7569ae69a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535660, 'reachable_time': 26249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232311, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd739554\x2d520e\x2d4e70\x2d9045\x2dbd1e5e1f0fe0.mount: Deactivated successfully.
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.385 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:42:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:42:56.385 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6972d6-e4fd-4638-8446-d4d45ac92d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.423 182729 INFO nova.compute.manager [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.424 182729 DEBUG oslo.service.loopingcall [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.424 182729 DEBUG nova.compute.manager [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:42:56 compute-0 nova_compute[182725]: 2026-01-22 22:42:56.425 182729 DEBUG nova.network.neutron [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.494 182729 DEBUG nova.network.neutron [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.520 182729 INFO nova.compute.manager [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Took 1.10 seconds to deallocate network for instance.
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.582 182729 DEBUG nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-unplugged-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.583 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.583 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.584 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.585 182729 DEBUG nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] No waiting events found dispatching network-vif-unplugged-452a5215-fc0f-4c85-bf69-268db34e744e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.585 182729 DEBUG nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-unplugged-452a5215-fc0f-4c85-bf69-268db34e744e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.585 182729 DEBUG nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.586 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.587 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.587 182729 DEBUG oslo_concurrency.lockutils [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.588 182729 DEBUG nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] No waiting events found dispatching network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.588 182729 WARNING nova.compute.manager [req-4cbef12e-694a-4d04-9e13-444e27742571 req-31654efa-f241-4721-a600-b54ad676b3c3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received unexpected event network-vif-plugged-452a5215-fc0f-4c85-bf69-268db34e744e for instance with vm_state active and task_state deleting.
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.599 182729 DEBUG nova.compute.manager [req-19d35b09-c0e8-400e-a554-f64d98ad22a2 req-0d6fc2e1-5d35-4450-bb00-cf8080098422 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Received event network-vif-deleted-452a5215-fc0f-4c85-bf69-268db34e744e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.617 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.618 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.696 182729 DEBUG nova.compute.provider_tree [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.710 182729 DEBUG nova.scheduler.client.report [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.747 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.793 182729 INFO nova.scheduler.client.report [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance 11abf2b9-1bc0-4393-b971-0ee745aa1e75
Jan 22 22:42:57 compute-0 nova_compute[182725]: 2026-01-22 22:42:57.889 182729 DEBUG oslo_concurrency.lockutils [None req-235ed01d-4fbf-4dfe-9bd2-cbf278e16a6b b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "11abf2b9-1bc0-4393-b971-0ee745aa1e75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:42:59 compute-0 nova_compute[182725]: 2026-01-22 22:42:59.268 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:01 compute-0 podman[232312]: 2026-01-22 22:43:01.146832364 +0000 UTC m=+0.084114942 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 22:43:01 compute-0 nova_compute[182725]: 2026-01-22 22:43:01.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:04 compute-0 nova_compute[182725]: 2026-01-22 22:43:04.271 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:06 compute-0 nova_compute[182725]: 2026-01-22 22:43:06.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:07 compute-0 podman[232334]: 2026-01-22 22:43:07.134140563 +0000 UTC m=+0.068800229 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 22 22:43:07 compute-0 podman[232333]: 2026-01-22 22:43:07.142297807 +0000 UTC m=+0.082633904 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 22:43:09 compute-0 nova_compute[182725]: 2026-01-22 22:43:09.272 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:11 compute-0 nova_compute[182725]: 2026-01-22 22:43:11.318 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121776.3169737, 11abf2b9-1bc0-4393-b971-0ee745aa1e75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:11 compute-0 nova_compute[182725]: 2026-01-22 22:43:11.318 182729 INFO nova.compute.manager [-] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] VM Stopped (Lifecycle Event)
Jan 22 22:43:11 compute-0 nova_compute[182725]: 2026-01-22 22:43:11.335 182729 DEBUG nova.compute.manager [None req-34c4004a-de13-40d7-9f36-ad9558646ca0 - - - - - -] [instance: 11abf2b9-1bc0-4393-b971-0ee745aa1e75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:11 compute-0 nova_compute[182725]: 2026-01-22 22:43:11.341 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.442 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.442 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:12.451 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:12.451 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:12.452 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.459 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.562 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.563 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.567 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.567 182729 INFO nova.compute.claims [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.703 182729 DEBUG nova.compute.provider_tree [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.727 182729 DEBUG nova.scheduler.client.report [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.755 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.755 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.828 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.828 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.860 182729 INFO nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.878 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.997 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.998 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.998 182729 INFO nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Creating image(s)
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.999 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:12 compute-0 nova_compute[182725]: 2026-01-22 22:43:12.999 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.000 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.011 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.037 182729 DEBUG nova.policy [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839eb51e89b14157b8da40ae1b480ef3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.097 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.098 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.098 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.108 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.165 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.166 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.198 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.199 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.199 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.253 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.254 182729 DEBUG nova.virt.disk.api [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.254 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.341 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.342 182729 DEBUG nova.virt.disk.api [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.343 182729 DEBUG nova.objects.instance [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.361 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.361 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Ensure instance console log exists: /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.361 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.362 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:13 compute-0 nova_compute[182725]: 2026-01-22 22:43:13.362 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:14 compute-0 nova_compute[182725]: 2026-01-22 22:43:14.275 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:14 compute-0 nova_compute[182725]: 2026-01-22 22:43:14.747 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Successfully created port: 77888806-9b6a-4b3d-a528-863c5c5801a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.799 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Successfully updated port: 77888806-9b6a-4b3d-a528-863c5c5801a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.820 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.821 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.822 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.900 182729 DEBUG nova.compute.manager [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.901 182729 DEBUG nova.compute.manager [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing instance network info cache due to event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.902 182729 DEBUG oslo_concurrency.lockutils [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:15 compute-0 nova_compute[182725]: 2026-01-22 22:43:15.998 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.343 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.898 182729 DEBUG nova.network.neutron [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.915 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.916 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance network_info: |[{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.917 182729 DEBUG oslo_concurrency.lockutils [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.917 182729 DEBUG nova.network.neutron [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.922 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start _get_guest_xml network_info=[{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.932 182729 WARNING nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.944 182729 DEBUG nova.virt.libvirt.host [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.945 182729 DEBUG nova.virt.libvirt.host [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.949 182729 DEBUG nova.virt.libvirt.host [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.950 182729 DEBUG nova.virt.libvirt.host [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.952 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.953 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.954 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.954 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.955 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.955 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.956 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.956 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.957 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.957 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.958 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.959 182729 DEBUG nova.virt.hardware [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.966 182729 DEBUG nova.virt.libvirt.vif [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:12Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.967 182729 DEBUG nova.network.os_vif_util [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.968 182729 DEBUG nova.network.os_vif_util [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.970 182729 DEBUG nova.objects.instance [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.989 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <uuid>1e7db515-c991-4967-b53b-01c33eaadab2</uuid>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <name>instance-00000090</name>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-603441666</nova:name>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:43:16</nova:creationTime>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         <nova:port uuid="77888806-9b6a-4b3d-a528-863c5c5801a7">
Jan 22 22:43:16 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <system>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="serial">1e7db515-c991-4967-b53b-01c33eaadab2</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="uuid">1e7db515-c991-4967-b53b-01c33eaadab2</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </system>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <os>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </os>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <features>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </features>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:2e:34:7d"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <target dev="tap77888806-9b"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/console.log" append="off"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <video>
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </video>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:43:16 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:43:16 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:43:16 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:43:16 compute-0 nova_compute[182725]: </domain>
Jan 22 22:43:16 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.990 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Preparing to wait for external event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.990 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.990 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.991 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.992 182729 DEBUG nova.virt.libvirt.vif [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:12Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.992 182729 DEBUG nova.network.os_vif_util [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.993 182729 DEBUG nova.network.os_vif_util [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.995 182729 DEBUG os_vif [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.996 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.996 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:16 compute-0 nova_compute[182725]: 2026-01-22 22:43:16.997 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.000 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.001 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77888806-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.001 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77888806-9b, col_values=(('external_ids', {'iface-id': '77888806-9b6a-4b3d-a528-863c5c5801a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:34:7d', 'vm-uuid': '1e7db515-c991-4967-b53b-01c33eaadab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.003 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.0044] manager: (tap77888806-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.006 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.009 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.011 182729 INFO os_vif [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b')
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.064 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.065 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.066 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:2e:34:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.067 182729 INFO nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Using config drive
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.420 182729 INFO nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Creating config drive at /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.428 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzhv6aiug execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.557 182729 DEBUG oslo_concurrency.processutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzhv6aiug" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:17 compute-0 kernel: tap77888806-9b: entered promiscuous mode
Jan 22 22:43:17 compute-0 ovn_controller[94850]: 2026-01-22T22:43:17Z|00570|binding|INFO|Claiming lport 77888806-9b6a-4b3d-a528-863c5c5801a7 for this chassis.
Jan 22 22:43:17 compute-0 ovn_controller[94850]: 2026-01-22T22:43:17Z|00571|binding|INFO|77888806-9b6a-4b3d-a528-863c5c5801a7: Claiming fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.6440] manager: (tap77888806-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.644 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.648 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.681 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.684 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c bound to our chassis
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.686 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:17 compute-0 systemd-udevd[232410]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.704 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2161485c-bec0-449d-ba91-2b35a209fb82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.705 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20c3083d-51 in ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.7101] device (tap77888806-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.708 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20c3083d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.708 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f66e9669-f86b-4fee-8efa-b1c6f48b58e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.710 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f671661c-3216-4707-925c-4de3fd87bbf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.7119] device (tap77888806-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:43:17 compute-0 systemd-machined[154006]: New machine qemu-64-instance-00000090.
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.728 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[b83402ce-b585-443f-b2a9-3379fcaa998a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 ovn_controller[94850]: 2026-01-22T22:43:17Z|00572|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 ovn-installed in OVS
Jan 22 22:43:17 compute-0 ovn_controller[94850]: 2026-01-22T22:43:17Z|00573|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 up in Southbound
Jan 22 22:43:17 compute-0 nova_compute[182725]: 2026-01-22 22:43:17.748 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.749 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[54e572d6-15b2-4d07-ac15-b100273c29d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-00000090.
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.786 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e9316bc6-8594-4fb7-b6f8-2f20288b385f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.791 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea5bb45-1c94-4298-a670-906bca40cfb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.7940] manager: (tap20c3083d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Jan 22 22:43:17 compute-0 systemd-udevd[232415]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.828 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe98b70-83aa-4746-8005-13cee73d4e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.832 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef008c6-1929-4a77-9bb2-07c3da197600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 NetworkManager[54954]: <info>  [1769121797.8547] device (tap20c3083d-50): carrier: link connected
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.861 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bab7c220-1abb-46f0-9789-32e1a39ae664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.880 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c452b59b-f984-43d5-b230-c5b41882f8bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20c3083d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:a9:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543545, 'reachable_time': 20768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232445, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.895 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[44bcf1bf-8057-484a-bda9-e303b8caadb8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:a9b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 543545, 'tstamp': 543545}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232446, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.909 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[70fff49d-64f9-4068-b415-14d119bba488]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20c3083d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:a9:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543545, 'reachable_time': 20768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232447, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:17.943 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7b65f027-eeda-46bd-a340-21eaac634b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.009 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[81a6e767-f897-49ad-8e50-d72197a3f7ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.011 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20c3083d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.011 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.012 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20c3083d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:18 compute-0 NetworkManager[54954]: <info>  [1769121798.0145] manager: (tap20c3083d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 22 22:43:18 compute-0 kernel: tap20c3083d-50: entered promiscuous mode
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.017 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20c3083d-50, col_values=(('external_ids', {'iface-id': '1da0bccc-bb1b-406e-9858-43a31781159d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:18 compute-0 ovn_controller[94850]: 2026-01-22T22:43:18Z|00574|binding|INFO|Releasing lport 1da0bccc-bb1b-406e-9858-43a31781159d from this chassis (sb_readonly=0)
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.022 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.022 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.023 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ab426da9-e179-4eec-ac60-1789e99118b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.023 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:43:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:18.025 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'env', 'PROCESS_TAG=haproxy-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20c3083d-5059-4bbb-a1bc-ca13d504e79c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.136 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121798.135483, 1e7db515-c991-4967-b53b-01c33eaadab2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.136 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Started (Lifecycle Event)
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.161 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.165 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121798.135906, 1e7db515-c991-4967-b53b-01c33eaadab2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.165 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Paused (Lifecycle Event)
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.186 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.189 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.212 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.329 182729 DEBUG nova.compute.manager [req-0cec728b-a6be-4a44-81a1-a267f487913c req-fb36647f-994e-4b05-b17a-6a47024501ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.330 182729 DEBUG oslo_concurrency.lockutils [req-0cec728b-a6be-4a44-81a1-a267f487913c req-fb36647f-994e-4b05-b17a-6a47024501ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.330 182729 DEBUG oslo_concurrency.lockutils [req-0cec728b-a6be-4a44-81a1-a267f487913c req-fb36647f-994e-4b05-b17a-6a47024501ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.331 182729 DEBUG oslo_concurrency.lockutils [req-0cec728b-a6be-4a44-81a1-a267f487913c req-fb36647f-994e-4b05-b17a-6a47024501ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.331 182729 DEBUG nova.compute.manager [req-0cec728b-a6be-4a44-81a1-a267f487913c req-fb36647f-994e-4b05-b17a-6a47024501ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Processing event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.332 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.335 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121798.3353074, 1e7db515-c991-4967-b53b-01c33eaadab2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.335 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Resumed (Lifecycle Event)
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.337 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.341 182729 INFO nova.virt.libvirt.driver [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance spawned successfully.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.342 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.364 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.371 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.375 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.376 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.376 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.377 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.377 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.378 182729 DEBUG nova.virt.libvirt.driver [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.401 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:43:18 compute-0 podman[232486]: 2026-01-22 22:43:18.406706136 +0000 UTC m=+0.065036625 container create 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.437 182729 DEBUG nova.network.neutron [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updated VIF entry in instance network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.437 182729 DEBUG nova.network.neutron [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.440 182729 INFO nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Took 5.44 seconds to spawn the instance on the hypervisor.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.440 182729 DEBUG nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:18 compute-0 systemd[1]: Started libpod-conmon-24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444.scope.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.452 182729 DEBUG oslo_concurrency.lockutils [req-7058e845-2f2a-4816-81e4-13ac8879eabf req-a69488de-540c-4dcf-a34f-fb21c7164e3a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:18 compute-0 podman[232486]: 2026-01-22 22:43:18.367562688 +0000 UTC m=+0.025893217 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:43:18 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:43:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffb2e8408d6c1ac3e8a9fc89e2956992d36fafd81b02c6e9c031e4bdbdb37a45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:43:18 compute-0 podman[232486]: 2026-01-22 22:43:18.501195216 +0000 UTC m=+0.159525675 container init 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:43:18 compute-0 podman[232504]: 2026-01-22 22:43:18.504057337 +0000 UTC m=+0.049539128 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:43:18 compute-0 podman[232486]: 2026-01-22 22:43:18.508801936 +0000 UTC m=+0.167132385 container start 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 22:43:18 compute-0 podman[232500]: 2026-01-22 22:43:18.518766354 +0000 UTC m=+0.063600689 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:43:18 compute-0 podman[232503]: 2026-01-22 22:43:18.527888502 +0000 UTC m=+0.076530312 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:43:18 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [NOTICE]   (232568) : New worker (232570) forked
Jan 22 22:43:18 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [NOTICE]   (232568) : Loading success.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.540 182729 INFO nova.compute.manager [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Took 6.02 seconds to build instance.
Jan 22 22:43:18 compute-0 nova_compute[182725]: 2026-01-22 22:43:18.560 182729 DEBUG oslo_concurrency.lockutils [None req-62891965-93ae-4e52-b82c-b788f590465d 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:19 compute-0 nova_compute[182725]: 2026-01-22 22:43:19.277 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.433 182729 DEBUG nova.compute.manager [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.434 182729 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.435 182729 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.436 182729 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.436 182729 DEBUG nova.compute.manager [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:43:20 compute-0 nova_compute[182725]: 2026-01-22 22:43:20.437 182729 WARNING nova.compute.manager [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state active and task_state None.
Jan 22 22:43:21 compute-0 nova_compute[182725]: 2026-01-22 22:43:21.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:22 compute-0 nova_compute[182725]: 2026-01-22 22:43:22.004 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:23 compute-0 NetworkManager[54954]: <info>  [1769121803.4731] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Jan 22 22:43:23 compute-0 NetworkManager[54954]: <info>  [1769121803.4755] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.606 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:23 compute-0 ovn_controller[94850]: 2026-01-22T22:43:23Z|00575|binding|INFO|Releasing lport 1da0bccc-bb1b-406e-9858-43a31781159d from this chassis (sb_readonly=0)
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.624 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.900 182729 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.901 182729 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing instance network info cache due to event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.902 182729 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.902 182729 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:23 compute-0 nova_compute[182725]: 2026-01-22 22:43:23.903 182729 DEBUG nova.network.neutron [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:43:24 compute-0 nova_compute[182725]: 2026-01-22 22:43:24.280 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:27 compute-0 nova_compute[182725]: 2026-01-22 22:43:27.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:27 compute-0 nova_compute[182725]: 2026-01-22 22:43:27.631 182729 DEBUG nova.network.neutron [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updated VIF entry in instance network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:43:27 compute-0 nova_compute[182725]: 2026-01-22 22:43:27.632 182729 DEBUG nova.network.neutron [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:27 compute-0 nova_compute[182725]: 2026-01-22 22:43:27.662 182729 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.914 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.914 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.914 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:43:28 compute-0 nova_compute[182725]: 2026-01-22 22:43:28.978 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.052 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.054 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.112 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:29 compute-0 ovn_controller[94850]: 2026-01-22T22:43:29Z|00576|binding|INFO|Releasing lport 1da0bccc-bb1b-406e-9858-43a31781159d from this chassis (sb_readonly=0)
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.287 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.293 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.295 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5464MB free_disk=73.33110427856445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.295 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.295 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.366 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 1e7db515-c991-4967-b53b-01c33eaadab2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.366 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.366 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.407 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.423 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.439 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:43:29 compute-0 nova_compute[182725]: 2026-01-22 22:43:29.439 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.439 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.439 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.439 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.718 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.718 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.719 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:43:30 compute-0 nova_compute[182725]: 2026-01-22 22:43:30.719 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:30 compute-0 ovn_controller[94850]: 2026-01-22T22:43:30Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:43:30 compute-0 ovn_controller[94850]: 2026-01-22T22:43:30Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.010 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:32 compute-0 podman[232597]: 2026-01-22 22:43:32.135840784 +0000 UTC m=+0.063067396 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.336 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.373 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.373 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.374 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:32 compute-0 nova_compute[182725]: 2026-01-22 22:43:32.374 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:34 compute-0 nova_compute[182725]: 2026-01-22 22:43:34.289 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:34 compute-0 nova_compute[182725]: 2026-01-22 22:43:34.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:35 compute-0 nova_compute[182725]: 2026-01-22 22:43:35.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:36 compute-0 nova_compute[182725]: 2026-01-22 22:43:36.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:36 compute-0 nova_compute[182725]: 2026-01-22 22:43:36.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:43:37 compute-0 nova_compute[182725]: 2026-01-22 22:43:37.012 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:38 compute-0 podman[232618]: 2026-01-22 22:43:38.132649063 +0000 UTC m=+0.056443380 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 22:43:38 compute-0 podman[232617]: 2026-01-22 22:43:38.157633737 +0000 UTC m=+0.085575968 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:43:38 compute-0 nova_compute[182725]: 2026-01-22 22:43:38.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:39 compute-0 nova_compute[182725]: 2026-01-22 22:43:39.291 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.552 182729 INFO nova.compute.manager [None req-f59ec303-fb7a-4798-99e4-a37fca5df54b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Get console output
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.557 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.962 182729 DEBUG oslo_concurrency.lockutils [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.962 182729 DEBUG oslo_concurrency.lockutils [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.962 182729 DEBUG nova.compute.manager [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.966 182729 DEBUG nova.compute.manager [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.967 182729 DEBUG nova.objects.instance [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'flavor' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:40 compute-0 nova_compute[182725]: 2026-01-22 22:43:40.999 182729 DEBUG nova.objects.instance [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'info_cache' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:41 compute-0 nova_compute[182725]: 2026-01-22 22:43:41.030 182729 DEBUG nova.virt.libvirt.driver [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:43:41 compute-0 nova_compute[182725]: 2026-01-22 22:43:41.297 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:41.987 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:43:41 compute-0 nova_compute[182725]: 2026-01-22 22:43:41.988 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:41.988 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:43:42 compute-0 nova_compute[182725]: 2026-01-22 22:43:42.014 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:42.991 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:43 compute-0 kernel: tap77888806-9b (unregistering): left promiscuous mode
Jan 22 22:43:43 compute-0 NetworkManager[54954]: <info>  [1769121823.1918] device (tap77888806-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:43:43 compute-0 ovn_controller[94850]: 2026-01-22T22:43:43Z|00577|binding|INFO|Releasing lport 77888806-9b6a-4b3d-a528-863c5c5801a7 from this chassis (sb_readonly=0)
Jan 22 22:43:43 compute-0 ovn_controller[94850]: 2026-01-22T22:43:43Z|00578|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 down in Southbound
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 ovn_controller[94850]: 2026-01-22T22:43:43Z|00579|binding|INFO|Removing iface tap77888806-9b ovn-installed in OVS
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.206 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.208 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c unbound from our chassis
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.209 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.210 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[58a6270e-65ce-42d3-9d3f-b2717849b7df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.211 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c namespace which is not needed anymore
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.215 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 22 22:43:43 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000090.scope: Consumed 12.667s CPU time.
Jan 22 22:43:43 compute-0 systemd-machined[154006]: Machine qemu-64-instance-00000090 terminated.
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [NOTICE]   (232568) : haproxy version is 2.8.14-c23fe91
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [NOTICE]   (232568) : path to executable is /usr/sbin/haproxy
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [WARNING]  (232568) : Exiting Master process...
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [WARNING]  (232568) : Exiting Master process...
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [ALERT]    (232568) : Current worker (232570) exited with code 143 (Terminated)
Jan 22 22:43:43 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232509]: [WARNING]  (232568) : All workers exited. Exiting... (0)
Jan 22 22:43:43 compute-0 systemd[1]: libpod-24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444.scope: Deactivated successfully.
Jan 22 22:43:43 compute-0 podman[232689]: 2026-01-22 22:43:43.332629523 +0000 UTC m=+0.040679567 container died 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:43:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-ffb2e8408d6c1ac3e8a9fc89e2956992d36fafd81b02c6e9c031e4bdbdb37a45-merged.mount: Deactivated successfully.
Jan 22 22:43:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444-userdata-shm.mount: Deactivated successfully.
Jan 22 22:43:43 compute-0 podman[232689]: 2026-01-22 22:43:43.370899959 +0000 UTC m=+0.078949993 container cleanup 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:43:43 compute-0 systemd[1]: libpod-conmon-24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444.scope: Deactivated successfully.
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.422 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.425 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 podman[232718]: 2026-01-22 22:43:43.43380922 +0000 UTC m=+0.041874227 container remove 24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.439 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[164aa544-3538-4fbf-9926-a9e1b17bf43d]: (4, ('Thu Jan 22 10:43:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c (24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444)\n24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444\nThu Jan 22 10:43:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c (24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444)\n24aae2ac4f1b1a4e34fa3cc59846afa82c47d8358ef1ef55d0fd2b18b7ff8444\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.441 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e214e571-f5ad-4764-aa31-f4ac0a8b9ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.442 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20c3083d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.444 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 kernel: tap20c3083d-50: left promiscuous mode
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.458 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.461 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.465 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[690a40a9-9b6a-421a-b390-803fbb72b9a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.487 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[013170dc-6a2a-497b-b4b1-b1b3f8eab5a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.488 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8f41f2-5afd-4b82-93c5-7539e1b2a9ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.504 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[62f9c472-bcc9-4f9c-abaf-b7711a9a0a36]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543538, 'reachable_time': 40653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232752, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.506 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:43:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:43.506 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[02aa898a-052d-43e2-8ece-926fc37337ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d20c3083d\x2d5059\x2d4bbb\x2da1bc\x2dca13d504e79c.mount: Deactivated successfully.
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.773 182729 DEBUG nova.compute.manager [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.773 182729 DEBUG oslo_concurrency.lockutils [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.774 182729 DEBUG oslo_concurrency.lockutils [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.774 182729 DEBUG oslo_concurrency.lockutils [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.774 182729 DEBUG nova.compute.manager [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.774 182729 WARNING nova.compute.manager [req-f392d686-ca8e-44f5-a5b9-cd8341a88f6c req-238f6274-cf89-45b4-b10e-b27a8c42a27f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state active and task_state powering-off.
Jan 22 22:43:43 compute-0 nova_compute[182725]: 2026-01-22 22:43:43.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.022 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.043 182729 INFO nova.virt.libvirt.driver [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance shutdown successfully after 3 seconds.
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.049 182729 INFO nova.virt.libvirt.driver [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance destroyed successfully.
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.050 182729 DEBUG nova.objects.instance [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.061 182729 DEBUG nova.compute.manager [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.132 182729 DEBUG oslo_concurrency.lockutils [None req-2b4f46a6-d310-4e67-a9c0-25649a797968 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:44 compute-0 nova_compute[182725]: 2026-01-22 22:43:44.294 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.873 182729 DEBUG nova.compute.manager [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.874 182729 DEBUG oslo_concurrency.lockutils [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.874 182729 DEBUG oslo_concurrency.lockutils [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.874 182729 DEBUG oslo_concurrency.lockutils [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.875 182729 DEBUG nova.compute.manager [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:43:45 compute-0 nova_compute[182725]: 2026-01-22 22:43:45.875 182729 WARNING nova.compute.manager [req-a6f89071-6150-4007-8fee-5daa9cd7c584 req-dc4a44ab-3288-4358-a7cf-a4bc42c1c420 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state stopped and task_state None.
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.141 182729 INFO nova.compute.manager [None req-12a979b6-4568-4a54-afc2-edd41eaed5b2 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Get console output
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.410 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'flavor' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.437 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'info_cache' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.469 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.470 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:46 compute-0 nova_compute[182725]: 2026-01-22 22:43:46.470 182729 DEBUG nova.network.neutron [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.016 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.566 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.566 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.586 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.679 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.679 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.687 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.688 182729 INFO nova.compute.claims [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.830 182729 DEBUG nova.network.neutron [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.879 182729 DEBUG nova.compute.provider_tree [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.882 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.894 182729 DEBUG nova.scheduler.client.report [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.906 182729 INFO nova.virt.libvirt.driver [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance destroyed successfully.
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.907 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.913 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.914 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.917 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.933 182729 DEBUG nova.virt.libvirt.vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:44Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.934 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.935 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.935 182729 DEBUG os_vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.937 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.937 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77888806-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.942 182729 INFO os_vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b')
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.949 182729 DEBUG nova.virt.libvirt.driver [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start _get_guest_xml network_info=[{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.952 182729 WARNING nova.virt.libvirt.driver [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.956 182729 DEBUG nova.virt.libvirt.host [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.957 182729 DEBUG nova.virt.libvirt.host [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.960 182729 DEBUG nova.virt.libvirt.host [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.961 182729 DEBUG nova.virt.libvirt.host [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.962 182729 DEBUG nova.virt.libvirt.driver [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.962 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.963 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.963 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.963 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.963 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.964 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.964 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.964 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.964 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.964 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.965 182729 DEBUG nova.virt.hardware [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.965 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.969 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.970 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:43:47 compute-0 nova_compute[182725]: 2026-01-22 22:43:47.992 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.012 182729 INFO nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.031 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.050 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.051 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.052 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.052 182729 DEBUG oslo_concurrency.lockutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.053 182729 DEBUG nova.virt.libvirt.vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:44Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.054 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.055 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.056 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.078 182729 DEBUG nova.virt.libvirt.driver [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <uuid>1e7db515-c991-4967-b53b-01c33eaadab2</uuid>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <name>instance-00000090</name>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-603441666</nova:name>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:43:47</nova:creationTime>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         <nova:port uuid="77888806-9b6a-4b3d-a528-863c5c5801a7">
Jan 22 22:43:48 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <system>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="serial">1e7db515-c991-4967-b53b-01c33eaadab2</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="uuid">1e7db515-c991-4967-b53b-01c33eaadab2</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </system>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <os>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </os>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <features>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </features>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk.config"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:2e:34:7d"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <target dev="tap77888806-9b"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/console.log" append="off"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <video>
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </video>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <input type="keyboard" bus="usb"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:43:48 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:43:48 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:43:48 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:43:48 compute-0 nova_compute[182725]: </domain>
Jan 22 22:43:48 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.081 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.137 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.138 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.158 182729 DEBUG nova.policy [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.164 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.165 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.166 182729 INFO nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Creating image(s)
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.166 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.167 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.168 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.184 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.203 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.204 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.220 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.240 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.241 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.241 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.252 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.280 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.281 182729 DEBUG nova.virt.disk.api [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.282 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.309 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.310 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.341 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.342 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.343 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.362 182729 DEBUG oslo_concurrency.processutils [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.363 182729 DEBUG nova.virt.disk.api [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.363 182729 DEBUG nova.objects.instance [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.384 182729 DEBUG nova.virt.libvirt.vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:44Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.384 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.385 182729 DEBUG nova.network.os_vif_util [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.387 182729 DEBUG os_vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.388 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.388 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.389 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.391 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.391 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77888806-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.392 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77888806-9b, col_values=(('external_ids', {'iface-id': '77888806-9b6a-4b3d-a528-863c5c5801a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:34:7d', 'vm-uuid': '1e7db515-c991-4967-b53b-01c33eaadab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.393 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.3945] manager: (tap77888806-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.395 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.398 182729 INFO os_vif [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b')
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.403 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.403 182729 DEBUG nova.virt.disk.api [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Checking if we can resize image /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.404 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.461 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.462 182729 DEBUG nova.virt.disk.api [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Cannot resize image /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.462 182729 DEBUG nova.objects.instance [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'migration_context' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:48 compute-0 kernel: tap77888806-9b: entered promiscuous mode
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.4703] manager: (tap77888806-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Jan 22 22:43:48 compute-0 ovn_controller[94850]: 2026-01-22T22:43:48Z|00580|binding|INFO|Claiming lport 77888806-9b6a-4b3d-a528-863c5c5801a7 for this chassis.
Jan 22 22:43:48 compute-0 ovn_controller[94850]: 2026-01-22T22:43:48Z|00581|binding|INFO|77888806-9b6a-4b3d-a528-863c5c5801a7: Claiming fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.471 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.478 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.478 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Ensure instance console log exists: /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.479 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.479 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.479 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.481 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.483 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c bound to our chassis
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.484 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:48 compute-0 ovn_controller[94850]: 2026-01-22T22:43:48Z|00582|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 ovn-installed in OVS
Jan 22 22:43:48 compute-0 ovn_controller[94850]: 2026-01-22T22:43:48Z|00583|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 up in Southbound
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.485 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.496 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eadd714f-1e8d-4578-8bb1-d50da0367384]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.496 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20c3083d-51 in ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:43:48 compute-0 systemd-udevd[232800]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.498 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20c3083d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.498 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8254f230-7073-4a9c-9094-27e7ef62f054]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.499 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[688a2da8-a13f-49f3-be3f-e0e6b0fa3b03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.5079] device (tap77888806-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.5089] device (tap77888806-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.510 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6118ff-9071-41ea-8f7c-6f0b63c41269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 systemd-machined[154006]: New machine qemu-65-instance-00000090.
Jan 22 22:43:48 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-00000090.
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.533 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[19dda0de-8d5a-49c1-8890-831062fe4ab7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.563 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f24b4268-3515-48ec-a976-24d79a83aa0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.568 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1d286579-4872-4e00-8c06-f8c33a5a4309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 systemd-udevd[232804]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.5777] manager: (tap20c3083d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/271)
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.606 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c45a42-33c9-4608-bea6-2e02055e8bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.612 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[94dce2ad-b287-401b-8b24-f747da5f8245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 podman[232806]: 2026-01-22 22:43:48.61782864 +0000 UTC m=+0.069570519 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.6340] device (tap20c3083d-50): carrier: link connected
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.639 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a2400d34-d36a-41f5-8375-76afcdb53272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 podman[232808]: 2026-01-22 22:43:48.653590493 +0000 UTC m=+0.100599783 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.655 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[030a7d94-9c7b-412e-b44e-c2181135a761]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20c3083d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:a9:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546623, 'reachable_time': 36343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232897, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 podman[232809]: 2026-01-22 22:43:48.660633829 +0000 UTC m=+0.106003168 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.671 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9e81690d-9993-4f46-a4ec-1d7027550859]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:a9b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546623, 'tstamp': 546623}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232898, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.686 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4d418e-2d4c-40a8-9eac-60a6df9aa24c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20c3083d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:a9:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546623, 'reachable_time': 36343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232899, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.713 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[73d16db5-fc1f-4830-adec-4fbe936ba91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.766 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe799cbf-a356-49e3-a961-5e2780312645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.768 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20c3083d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.768 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.769 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20c3083d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 NetworkManager[54954]: <info>  [1769121828.7719] manager: (tap20c3083d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 22 22:43:48 compute-0 kernel: tap20c3083d-50: entered promiscuous mode
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.776 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20c3083d-50, col_values=(('external_ids', {'iface-id': '1da0bccc-bb1b-406e-9858-43a31781159d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:48 compute-0 ovn_controller[94850]: 2026-01-22T22:43:48Z|00584|binding|INFO|Releasing lport 1da0bccc-bb1b-406e-9858-43a31781159d from this chassis (sb_readonly=0)
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 nova_compute[182725]: 2026-01-22 22:43:48.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.793 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.794 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[10966494-c78c-4ee5-9d0a-acb44c8fe0b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.794 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/20c3083d-5059-4bbb-a1bc-ca13d504e79c.pid.haproxy
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 20c3083d-5059-4bbb-a1bc-ca13d504e79c
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:43:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:48.795 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'env', 'PROCESS_TAG=haproxy-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20c3083d-5059-4bbb-a1bc-ca13d504e79c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.015 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 1e7db515-c991-4967-b53b-01c33eaadab2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.015 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121829.0149457, 1e7db515-c991-4967-b53b-01c33eaadab2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.016 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Resumed (Lifecycle Event)
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.018 182729 DEBUG nova.compute.manager [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.023 182729 INFO nova.virt.libvirt.driver [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance rebooted successfully.
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.023 182729 DEBUG nova.compute.manager [None req-ac8f6026-fbfc-47b2-97f5-3ff76a0818c1 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.036 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.040 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.059 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.059 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121829.017611, 1e7db515-c991-4967-b53b-01c33eaadab2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.059 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Started (Lifecycle Event)
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.093 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.097 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:43:49 compute-0 podman[232936]: 2026-01-22 22:43:49.177655411 +0000 UTC m=+0.051689552 container create ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 22:43:49 compute-0 systemd[1]: Started libpod-conmon-ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589.scope.
Jan 22 22:43:49 compute-0 podman[232936]: 2026-01-22 22:43:49.14839447 +0000 UTC m=+0.022428621 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:43:49 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:43:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57db1d6ec895c9561dedcb73c921ea07a74223d97a89b45ad684af58101397bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:43:49 compute-0 podman[232936]: 2026-01-22 22:43:49.268885699 +0000 UTC m=+0.142919830 container init ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:43:49 compute-0 podman[232936]: 2026-01-22 22:43:49.274504109 +0000 UTC m=+0.148538240 container start ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.296 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:49 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [NOTICE]   (232956) : New worker (232958) forked
Jan 22 22:43:49 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [NOTICE]   (232956) : Loading success.
Jan 22 22:43:49 compute-0 nova_compute[182725]: 2026-01-22 22:43:49.992 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Successfully created port: 960537b2-fe8a-48ce-ace9-b39c09a20598 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.105 182729 DEBUG nova.compute.manager [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.106 182729 DEBUG oslo_concurrency.lockutils [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.106 182729 DEBUG oslo_concurrency.lockutils [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.106 182729 DEBUG oslo_concurrency.lockutils [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.106 182729 DEBUG nova.compute.manager [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:43:50 compute-0 nova_compute[182725]: 2026-01-22 22:43:50.106 182729 WARNING nova.compute.manager [req-00872c29-2df6-479c-ac87-80258cc60553 req-115529a3-6992-4796-bcc4-cef37cad4cb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state active and task_state None.
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.188 182729 DEBUG nova.compute.manager [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.189 182729 DEBUG oslo_concurrency.lockutils [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.189 182729 DEBUG oslo_concurrency.lockutils [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.190 182729 DEBUG oslo_concurrency.lockutils [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.190 182729 DEBUG nova.compute.manager [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:43:52 compute-0 nova_compute[182725]: 2026-01-22 22:43:52.190 182729 WARNING nova.compute.manager [req-37843c32-fea9-4c52-8957-d5f8ca84db58 req-6ae94ae5-d166-49cb-b40c-37886ca8174a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state active and task_state None.
Jan 22 22:43:53 compute-0 nova_compute[182725]: 2026-01-22 22:43:53.396 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.298 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.677 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Successfully updated port: 960537b2-fe8a-48ce-ace9-b39c09a20598 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.973 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.974 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquired lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.975 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.980 182729 DEBUG nova.compute.manager [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-changed-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.981 182729 DEBUG nova.compute.manager [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Refreshing instance network info cache due to event network-changed-960537b2-fe8a-48ce-ace9-b39c09a20598. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:43:54 compute-0 nova_compute[182725]: 2026-01-22 22:43:54.982 182729 DEBUG oslo_concurrency.lockutils [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:43:55 compute-0 nova_compute[182725]: 2026-01-22 22:43:55.586 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.787 182729 DEBUG nova.network.neutron [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updating instance_info_cache with network_info: [{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.811 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Releasing lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.812 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance network_info: |[{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.812 182729 DEBUG oslo_concurrency.lockutils [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.813 182729 DEBUG nova.network.neutron [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Refreshing network info cache for port 960537b2-fe8a-48ce-ace9-b39c09a20598 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.815 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start _get_guest_xml network_info=[{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.819 182729 WARNING nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.825 182729 DEBUG nova.virt.libvirt.host [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.826 182729 DEBUG nova.virt.libvirt.host [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.833 182729 DEBUG nova.virt.libvirt.host [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.834 182729 DEBUG nova.virt.libvirt.host [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.835 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.835 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.836 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.836 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.836 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.837 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.837 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.837 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.837 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.838 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.838 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.838 182729 DEBUG nova.virt.hardware [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.841 182729 DEBUG nova.virt.libvirt.vif [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1047355164',display_name='tempest-ServerRescueTestJSON-server-1047355164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1047355164',id=147,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-2pi3nzyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-69724
8807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:48Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=02611d7b-484c-4089-9de6-712e22cef735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.842 182729 DEBUG nova.network.os_vif_util [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.842 182729 DEBUG nova.network.os_vif_util [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.844 182729 DEBUG nova.objects.instance [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.856 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <uuid>02611d7b-484c-4089-9de6-712e22cef735</uuid>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <name>instance-00000093</name>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerRescueTestJSON-server-1047355164</nova:name>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:43:56</nova:creationTime>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:user uuid="21487f95977a444e83139b6e5faf83ce">tempest-ServerRescueTestJSON-697248807-project-member</nova:user>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:project uuid="c005f10296264b39a882736d172d2b47">tempest-ServerRescueTestJSON-697248807</nova:project>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         <nova:port uuid="960537b2-fe8a-48ce-ace9-b39c09a20598">
Jan 22 22:43:56 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <system>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="serial">02611d7b-484c-4089-9de6-712e22cef735</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="uuid">02611d7b-484c-4089-9de6-712e22cef735</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </system>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <os>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </os>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <features>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </features>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:d5:88:cb"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <target dev="tap960537b2-fe"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/console.log" append="off"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <video>
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </video>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:43:56 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:43:56 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:43:56 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:43:56 compute-0 nova_compute[182725]: </domain>
Jan 22 22:43:56 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.858 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Preparing to wait for external event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.859 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.859 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.859 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.860 182729 DEBUG nova.virt.libvirt.vif [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1047355164',display_name='tempest-ServerRescueTestJSON-server-1047355164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1047355164',id=147,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-2pi3nzyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTest
JSON-697248807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:48Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=02611d7b-484c-4089-9de6-712e22cef735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.860 182729 DEBUG nova.network.os_vif_util [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.861 182729 DEBUG nova.network.os_vif_util [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.861 182729 DEBUG os_vif [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.862 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.862 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.862 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.867 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.868 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap960537b2-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.869 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap960537b2-fe, col_values=(('external_ids', {'iface-id': '960537b2-fe8a-48ce-ace9-b39c09a20598', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:88:cb', 'vm-uuid': '02611d7b-484c-4089-9de6-712e22cef735'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:43:56 compute-0 NetworkManager[54954]: <info>  [1769121836.8732] manager: (tap960537b2-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.875 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.878 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.880 182729 INFO os_vif [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe')
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.935 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.935 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.936 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No VIF found with MAC fa:16:3e:d5:88:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:43:56 compute-0 nova_compute[182725]: 2026-01-22 22:43:56.936 182729 INFO nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Using config drive
Jan 22 22:43:57 compute-0 nova_compute[182725]: 2026-01-22 22:43:57.928 182729 INFO nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Creating config drive at /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config
Jan 22 22:43:57 compute-0 nova_compute[182725]: 2026-01-22 22:43:57.936 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwqpbv1e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.066 182729 DEBUG oslo_concurrency.processutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwqpbv1e" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:43:58 compute-0 kernel: tap960537b2-fe: entered promiscuous mode
Jan 22 22:43:58 compute-0 NetworkManager[54954]: <info>  [1769121838.1324] manager: (tap960537b2-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/274)
Jan 22 22:43:58 compute-0 ovn_controller[94850]: 2026-01-22T22:43:58Z|00585|binding|INFO|Claiming lport 960537b2-fe8a-48ce-ace9-b39c09a20598 for this chassis.
Jan 22 22:43:58 compute-0 ovn_controller[94850]: 2026-01-22T22:43:58Z|00586|binding|INFO|960537b2-fe8a-48ce-ace9-b39c09a20598: Claiming fa:16:3e:d5:88:cb 10.100.0.12
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.133 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:58.139 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:43:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:58.141 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis
Jan 22 22:43:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:58.142 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:43:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:43:58.143 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[636ccf8a-5560-4919-ad8a-646e092d45e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.146 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:58 compute-0 ovn_controller[94850]: 2026-01-22T22:43:58Z|00587|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 ovn-installed in OVS
Jan 22 22:43:58 compute-0 ovn_controller[94850]: 2026-01-22T22:43:58Z|00588|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 up in Southbound
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.152 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:58 compute-0 systemd-machined[154006]: New machine qemu-66-instance-00000093.
Jan 22 22:43:58 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000093.
Jan 22 22:43:58 compute-0 systemd-udevd[232989]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:43:58 compute-0 NetworkManager[54954]: <info>  [1769121838.2170] device (tap960537b2-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:43:58 compute-0 NetworkManager[54954]: <info>  [1769121838.2179] device (tap960537b2-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.525 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121838.5251477, 02611d7b-484c-4089-9de6-712e22cef735 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.527 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Started (Lifecycle Event)
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.564 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.568 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121838.5253036, 02611d7b-484c-4089-9de6-712e22cef735 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.568 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Paused (Lifecycle Event)
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.618 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.621 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:43:58 compute-0 nova_compute[182725]: 2026-01-22 22:43:58.665 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:43:59 compute-0 nova_compute[182725]: 2026-01-22 22:43:59.299 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:43:59 compute-0 nova_compute[182725]: 2026-01-22 22:43:59.852 182729 DEBUG nova.network.neutron [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updated VIF entry in instance network info cache for port 960537b2-fe8a-48ce-ace9-b39c09a20598. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:43:59 compute-0 nova_compute[182725]: 2026-01-22 22:43:59.852 182729 DEBUG nova.network.neutron [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updating instance_info_cache with network_info: [{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:43:59 compute-0 nova_compute[182725]: 2026-01-22 22:43:59.869 182729 DEBUG oslo_concurrency.lockutils [req-7cec6d5f-922c-4d34-aa04-9dfc20c37150 req-ac7c2a66-1329-4595-ae01-65f72838da14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.818 182729 DEBUG nova.compute.manager [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.819 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.819 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.819 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.820 182729 DEBUG nova.compute.manager [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Processing event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.820 182729 DEBUG nova.compute.manager [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.820 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.821 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.821 182729 DEBUG oslo_concurrency.lockutils [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.821 182729 DEBUG nova.compute.manager [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.821 182729 WARNING nova.compute.manager [req-0ad842d0-0da6-4826-baf7-bda8bcb66ea0 req-768e410e-7777-4044-b24a-386fd539cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state building and task_state spawning.
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.822 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.825 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121840.8257227, 02611d7b-484c-4089-9de6-712e22cef735 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.826 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Resumed (Lifecycle Event)
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.828 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.831 182729 INFO nova.virt.libvirt.driver [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance spawned successfully.
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.832 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.855 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.856 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.856 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.857 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.858 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.858 182729 DEBUG nova.virt.libvirt.driver [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.864 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.868 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.902 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.932 182729 INFO nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Took 12.77 seconds to spawn the instance on the hypervisor.
Jan 22 22:44:00 compute-0 nova_compute[182725]: 2026-01-22 22:44:00.932 182729 DEBUG nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:01 compute-0 nova_compute[182725]: 2026-01-22 22:44:01.005 182729 INFO nova.compute.manager [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Took 13.36 seconds to build instance.
Jan 22 22:44:01 compute-0 nova_compute[182725]: 2026-01-22 22:44:01.029 182729 DEBUG oslo_concurrency.lockutils [None req-7c5b2f1c-8969-4993-95c8-b7d9262ff4bb 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:01 compute-0 ovn_controller[94850]: 2026-01-22T22:44:01Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:44:01 compute-0 nova_compute[182725]: 2026-01-22 22:44:01.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:03 compute-0 podman[233012]: 2026-01-22 22:44:03.159841158 +0000 UTC m=+0.086901121 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:44:03 compute-0 nova_compute[182725]: 2026-01-22 22:44:03.244 182729 INFO nova.compute.manager [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Rescuing
Jan 22 22:44:03 compute-0 nova_compute[182725]: 2026-01-22 22:44:03.244 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:44:03 compute-0 nova_compute[182725]: 2026-01-22 22:44:03.245 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquired lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:44:03 compute-0 nova_compute[182725]: 2026-01-22 22:44:03.245 182729 DEBUG nova.network.neutron [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:44:04 compute-0 nova_compute[182725]: 2026-01-22 22:44:04.302 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:06 compute-0 nova_compute[182725]: 2026-01-22 22:44:06.054 182729 DEBUG nova.network.neutron [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updating instance_info_cache with network_info: [{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:44:06 compute-0 nova_compute[182725]: 2026-01-22 22:44:06.075 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Releasing lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:44:06 compute-0 nova_compute[182725]: 2026-01-22 22:44:06.408 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 22:44:06 compute-0 nova_compute[182725]: 2026-01-22 22:44:06.874 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:07 compute-0 nova_compute[182725]: 2026-01-22 22:44:07.111 182729 INFO nova.compute.manager [None req-e4007055-4bba-471b-91c8-07dc07308645 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Get console output
Jan 22 22:44:07 compute-0 nova_compute[182725]: 2026-01-22 22:44:07.117 211457 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.131 182729 DEBUG nova.compute.manager [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.132 182729 DEBUG nova.compute.manager [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing instance network info cache due to event network-changed-77888806-9b6a-4b3d-a528-863c5c5801a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.132 182729 DEBUG oslo_concurrency.lockutils [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.132 182729 DEBUG oslo_concurrency.lockutils [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.132 182729 DEBUG nova.network.neutron [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Refreshing network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.231 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.232 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.232 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.232 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.233 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.246 182729 INFO nova.compute.manager [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Terminating instance
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.261 182729 DEBUG nova.compute.manager [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:44:08 compute-0 kernel: tap77888806-9b (unregistering): left promiscuous mode
Jan 22 22:44:08 compute-0 NetworkManager[54954]: <info>  [1769121848.2946] device (tap77888806-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.296 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00589|binding|INFO|Releasing lport 77888806-9b6a-4b3d-a528-863c5c5801a7 from this chassis (sb_readonly=0)
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00590|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 down in Southbound
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00591|binding|INFO|Removing iface tap77888806-9b ovn-installed in OVS
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.299 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.306 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.307 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c unbound from our chassis
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.308 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.309 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[eea6c59e-ae6e-4115-a386-0fecf860499a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.310 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c namespace which is not needed anymore
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 22 22:44:08 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000090.scope: Consumed 12.731s CPU time.
Jan 22 22:44:08 compute-0 systemd-machined[154006]: Machine qemu-65-instance-00000090 terminated.
Jan 22 22:44:08 compute-0 podman[233036]: 2026-01-22 22:44:08.406331969 +0000 UTC m=+0.079036395 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Jan 22 22:44:08 compute-0 podman[233034]: 2026-01-22 22:44:08.437114988 +0000 UTC m=+0.103947377 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:44:08 compute-0 kernel: tap77888806-9b: entered promiscuous mode
Jan 22 22:44:08 compute-0 kernel: tap77888806-9b (unregistering): left promiscuous mode
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00592|binding|INFO|Claiming lport 77888806-9b6a-4b3d-a528-863c5c5801a7 for this chassis.
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00593|binding|INFO|77888806-9b6a-4b3d-a528-863c5c5801a7: Claiming fa:16:3e:2e:34:7d 10.100.0.8
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.483 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [NOTICE]   (232956) : haproxy version is 2.8.14-c23fe91
Jan 22 22:44:08 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [NOTICE]   (232956) : path to executable is /usr/sbin/haproxy
Jan 22 22:44:08 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [WARNING]  (232956) : Exiting Master process...
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.495 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:08 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [ALERT]    (232956) : Current worker (232958) exited with code 143 (Terminated)
Jan 22 22:44:08 compute-0 neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c[232952]: [WARNING]  (232956) : All workers exited. Exiting... (0)
Jan 22 22:44:08 compute-0 systemd[1]: libpod-ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589.scope: Deactivated successfully.
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00594|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 ovn-installed in OVS
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00595|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 up in Southbound
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00596|binding|INFO|Releasing lport 77888806-9b6a-4b3d-a528-863c5c5801a7 from this chassis (sb_readonly=1)
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00597|if_status|INFO|Dropped 2 log messages in last 682 seconds (most recently, 682 seconds ago) due to excessive rate
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00598|if_status|INFO|Not setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 down as sb is readonly
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00599|binding|INFO|Removing iface tap77888806-9b ovn-installed in OVS
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.507 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.508 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 podman[233098]: 2026-01-22 22:44:08.509384733 +0000 UTC m=+0.065393764 container died ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00600|binding|INFO|Releasing lport 77888806-9b6a-4b3d-a528-863c5c5801a7 from this chassis (sb_readonly=0)
Jan 22 22:44:08 compute-0 ovn_controller[94850]: 2026-01-22T22:44:08Z|00601|binding|INFO|Setting lport 77888806-9b6a-4b3d-a528-863c5c5801a7 down in Southbound
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.521 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:34:7d 10.100.0.8'], port_security=['fa:16:3e:2e:34:7d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e7db515-c991-4967-b53b-01c33eaadab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c949eca9-cdeb-4643-865a-57a458362392', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5caa45f3-4398-48b7-91ee-ad81d3db5e28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=77888806-9b6a-4b3d-a528-863c5c5801a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.522 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589-userdata-shm.mount: Deactivated successfully.
Jan 22 22:44:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-57db1d6ec895c9561dedcb73c921ea07a74223d97a89b45ad684af58101397bc-merged.mount: Deactivated successfully.
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.547 182729 INFO nova.virt.libvirt.driver [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance destroyed successfully.
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.548 182729 DEBUG nova.objects.instance [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 1e7db515-c991-4967-b53b-01c33eaadab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:08 compute-0 podman[233098]: 2026-01-22 22:44:08.552502109 +0000 UTC m=+0.108511130 container cleanup ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:44:08 compute-0 systemd[1]: libpod-conmon-ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589.scope: Deactivated successfully.
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.561 182729 DEBUG nova.virt.libvirt.vif [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-603441666',display_name='tempest-TestNetworkAdvancedServerOps-server-603441666',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-603441666',id=144,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLleIPWkzqQAYXdinIykbPmLUKqTX3mJMUySjMVg8PNK1kTsfLhFOM601y1SwcipFXR2ooPQMR4c2AlBekdCT6xxXuikuCWwJ9GvoQJa8Ou8p3ZSyWYBW16vokmhFqzdnw==',key_name='tempest-TestNetworkAdvancedServerOps-658154535',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-p2g7nu2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:49Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=1e7db515-c991-4967-b53b-01c33eaadab2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.562 182729 DEBUG nova.network.os_vif_util [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.562 182729 DEBUG nova.network.os_vif_util [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.563 182729 DEBUG os_vif [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.564 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.564 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77888806-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.567 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.569 182729 INFO os_vif [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:34:7d,bridge_name='br-int',has_traffic_filtering=True,id=77888806-9b6a-4b3d-a528-863c5c5801a7,network=Network(20c3083d-5059-4bbb-a1bc-ca13d504e79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77888806-9b')
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.569 182729 INFO nova.virt.libvirt.driver [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Deleting instance files /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2_del
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.570 182729 INFO nova.virt.libvirt.driver [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Deletion of /var/lib/nova/instances/1e7db515-c991-4967-b53b-01c33eaadab2_del complete
Jan 22 22:44:08 compute-0 podman[233141]: 2026-01-22 22:44:08.619497313 +0000 UTC m=+0.042629796 container remove ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.624 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[31c20a30-2184-475f-a58c-068981965a4b]: (4, ('Thu Jan 22 10:44:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c (ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589)\nee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589\nThu Jan 22 10:44:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c (ee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589)\nee8ace69427c6df45313e825bb81338f5f102ac685827143a473a11099472589\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.625 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ef83d387-68fa-44ce-89dc-489dd512b4ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.626 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20c3083d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.628 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 kernel: tap20c3083d-50: left promiscuous mode
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.639 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.641 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[52aa258c-70f5-4b08-a34e-0fb67fe9dd85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.650 182729 INFO nova.compute.manager [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.652 182729 DEBUG oslo.service.loopingcall [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.652 182729 DEBUG nova.compute.manager [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.652 182729 DEBUG nova.network.neutron [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.658 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e22e7f9d-bf87-4b46-9a62-98dcd3948b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.660 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff1ffce-e7f6-4771-86d9-d9b8a7e8a0da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.677 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2e994656-312d-4fa4-b0f1-7305a909e935]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546616, 'reachable_time': 18145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233156, 'error': None, 'target': 'ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d20c3083d\x2d5059\x2d4bbb\x2da1bc\x2dca13d504e79c.mount: Deactivated successfully.
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.680 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20c3083d-5059-4bbb-a1bc-ca13d504e79c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.680 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2cad38-ef6e-4bf9-a728-0ca4851d2a00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.682 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c unbound from our chassis
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.684 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.685 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[df4ebb93-dbef-4efd-bca6-38a9be8ddc33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.685 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 77888806-9b6a-4b3d-a528-863c5c5801a7 in datapath 20c3083d-5059-4bbb-a1bc-ca13d504e79c unbound from our chassis
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.687 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20c3083d-5059-4bbb-a1bc-ca13d504e79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:44:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:08.687 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[670a19ec-81ea-4c23-8cff-ee1b65c0176b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.843 182729 DEBUG nova.compute.manager [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.844 182729 DEBUG oslo_concurrency.lockutils [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.844 182729 DEBUG oslo_concurrency.lockutils [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.844 182729 DEBUG oslo_concurrency.lockutils [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.844 182729 DEBUG nova.compute.manager [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:08 compute-0 nova_compute[182725]: 2026-01-22 22:44:08.845 182729 DEBUG nova.compute.manager [req-c48cf0a5-1999-4c70-b089-f43b609eac52 req-d37c4b26-b307-4e90-a66d-fde4c0bc52ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-unplugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.114 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02611d7b-484c-4089-9de6-712e22cef735', 'name': 'tempest-ServerRescueTestJSON-server-1047355164', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000093', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c005f10296264b39a882736d172d2b47', 'user_id': '21487f95977a444e83139b6e5faf83ce', 'hostId': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.115 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.115 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>]
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.139 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.140 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74f9f7b2-d645-4242-b67d-a1b20143a782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.116089', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd541dc4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': 'b1bb0642e6becc36c002f1114fb3004c8940dd0ebcb14592f2e3d698a0c5ce2d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.116089', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd542936-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': '9b59050eb61ae153343d795439bca20903f79374dd02cad97cba1b2a045f823e'}]}, 'timestamp': '2026-01-22 22:44:09.140371', '_unique_id': '7de2c296264c4dd496041b0ad3b4bb65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.141 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.183 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.latency volume: 114465483 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.184 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.latency volume: 443371 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63cda041-518c-4762-8716-e738c29b237c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 114465483, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.142420', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd5ad31c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '1020b05b2c1d9f1a7e9a925dc73a574d27b0d7cbb3c3a84a0fecabd517c74cc8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 443371, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.142420', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd5adede-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': 'd9399316e210bfde4d9d3c54b3b72a460d6133d3ff7d4ef2d4a2f6bf3ce3bf3b'}]}, 'timestamp': '2026-01-22 22:44:09.184366', '_unique_id': '19b837f1bb2a4b4391452dd0bbce2218'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.201 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02611d7b-484c-4089-9de6-712e22cef735 / tap960537b2-fe inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.202 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae9cef63-9cb1-437c-8c15-80378f5d6bce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.186156', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd5d9fe8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '2bd73d8ad7815b3e686556149a8b1fa06a3200f836d44f34bee2586ce043473c'}]}, 'timestamp': '2026-01-22 22:44:09.202509', '_unique_id': '5f4bda07769241e4aa78d64406baa011'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.204 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.204 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76ed9a0f-efa1-4056-a229-06479745bef9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.204381', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd5df7e0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '6f5cb0080b221a6a303ccf9817c9f3a6a192c7d664de27477e7d100adba5a0f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.204381', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd5e0172-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '67b1a2c080a4b8d2b7ca3796ae5b1e11578d9fe3f32e6623700dd51c9e6ccacd'}]}, 'timestamp': '2026-01-22 22:44:09.204912', '_unique_id': '6ca2142b6d624bf0bc005bbdcfcaf8cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.206 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fb9e82e-2c1b-4651-b488-76a793880693', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.206410', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd5e4722-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '7d5180d29ae1ab19a236da478c2336aa591addc9d364d0d0f6277c6b979c6c4a'}]}, 'timestamp': '2026-01-22 22:44:09.206697', '_unique_id': '347d88fbaef54d80b8cb6140fbafa606'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.208 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.208 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eab8103a-c2dd-4b1b-a390-b67761b6c7a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.208040', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd5e8606-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '3c3f4d903e5747f576b6b308b7f4d641f4e153726b5f15df975e8fde31362da3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.208040', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd5e8f3e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': 'eba1c25ab150ab8a4cf661784976b83a675094c1ff206a2cdb436b1338179748'}]}, 'timestamp': '2026-01-22 22:44:09.208538', '_unique_id': '1888095033cd4de8ba94b716e901f9df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.210 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '537dd579-5f47-4503-90b4-74f0ece0b9ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.210039', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd5ed688-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '22547599ded80313a8b531f488bc637cc22377613de16f9b5f3487d307fd6b5b'}]}, 'timestamp': '2026-01-22 22:44:09.210355', '_unique_id': 'd4821e4daf9143998bc5e4eb9e0e81c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.211 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7df9a37-6a36-4324-babe-bf7aa741dd72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.211869', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd5f1d8c-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '723f38e825078df8a828eaec8db3389b4afd42b3932ac2291d0c96ff85c5023c'}]}, 'timestamp': '2026-01-22 22:44:09.212174', '_unique_id': 'ee1c10388e1f489c84d823681c59fbfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.213 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.213 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>]
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.213 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97ae50b4-e1b4-4e00-89e9-98e81b9809c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.213841', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd5f6aa8-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '9059520394ba3e8f42337911ee2a5db4d3a3048f43ce85b264b554085a74453e'}]}, 'timestamp': '2026-01-22 22:44:09.214201', '_unique_id': 'aa2c51ae777c4afdb6ba264185338672'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.215 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.243 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.243 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 02611d7b-484c-4089-9de6-712e22cef735: ceilometer.compute.pollsters.NoVolumeException
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.244 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.244 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18115b66-f388-4ca9-a4a3-aeb583d6a6bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.244103', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd640b94-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '0a00446abb0b678a8c621c1ac4fe0a9d08c143be9e73e43795168fb20fbbc602'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.244103', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd6419c2-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '9f090ddf08e7beb643ea2ce2f28176614cdefffd2a546dc7706a703fb41a518e'}]}, 'timestamp': '2026-01-22 22:44:09.244902', '_unique_id': '38acaf1eea4e4fb3bc82bbf512ccaa31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.246 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.247 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.247 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e210e048-40d7-4c36-a52c-19fdf6475801', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.247263', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd6485ba-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': 'bb6368c6167a3de0fa78321c3fc0da66315110728a289a8831b762886a9069fb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.247263', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd649294-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': '85bb832938facfb8cc27676c0bd86cc35b7122dc31212982e1d3faac312f6adb'}]}, 'timestamp': '2026-01-22 22:44:09.247954', '_unique_id': 'e335de4e5a4f4a22b1a99d6cd0ca2dd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.248 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.249 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0eb056fc-e2d5-4063-ba4a-191ef3d01cfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.249540', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd64dd26-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': 'c3f8398efbd561ebc93c40172763590d72bec501e2b72da968937112499f1377'}]}, 'timestamp': '2026-01-22 22:44:09.249901', '_unique_id': 'eb03161130d9427caf13dc0e55c83392'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.250 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.251 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c9203c4-5bbf-48de-ba4b-8ffa388b39f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.251812', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd65360e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': 'a52e9042fa55fc28f791490f6a72ae50e7fec5952cb67e1c3e0e316799b9f46b'}]}, 'timestamp': '2026-01-22 22:44:09.252158', '_unique_id': '9f76e50d156642a9a6c5e82daaa83891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.252 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.253 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '985257d1-8ea3-4342-8d32-a76fc9878d4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.253817', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd658460-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': 'ae95199f17751fb43da4cfe231ffe8d559a78ce15748dfa6eafea7d097b71bb9'}]}, 'timestamp': '2026-01-22 22:44:09.254190', '_unique_id': '0ecbea5a931840e089ec415867bdd7e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.254 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.255 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.256 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbb0ddbc-ab74-4930-9260-729716c97528', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.255820', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd65d262-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '32bebf8807d3b7523b3ce50be7b55bcf2a308d2d95999289d11104496fa55a1a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.255820', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd65ddd4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '866736d2f3d00fcdcb498a5161bed67414aef9891d4c4ea18e60fca5c934df53'}]}, 'timestamp': '2026-01-22 22:44:09.256429', '_unique_id': '12dec83d33f04556a5d4c64578a7d01f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.257 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.258 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '534f8152-1e52-4e63-81de-8413e2cb0603', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.258323', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd663464-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': 'b343cc9e91eb8f5f7fea38553f80e451d9c7be1d26b81c88da9c494cc8c11e19'}]}, 'timestamp': '2026-01-22 22:44:09.258722', '_unique_id': 'cde5008c0df143cba365799c4f4b83d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.259 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.260 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.260 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/cpu volume: 8120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cb9c882-8521-46d8-8d6d-1ebaa14c2050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8120000000, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735', 'timestamp': '2026-01-22T22:44:09.260717', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dd669328-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.900620403, 'message_signature': 'ed1421eef139ae888fed95a009884eb313b1de962333dce63ecfda19e8ca163d'}]}, 'timestamp': '2026-01-22 22:44:09.261084', '_unique_id': '1642b4c012bf45b4ad0a1b0241c0f123'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.261 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.262 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.263 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd09893d-57b6-46df-8624-5f7167328baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.262635', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd66df54-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': 'c6cbc5838ff3f0905833f301158268cec8a797a8b3987d57a37b1602473bc611'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.262635', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd66ebc0-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.773343244, 'message_signature': 'b9e3a9af7bdf1ea2535a6da4868c30fbf3d363e35229e216b48d853cb4cb5e25'}]}, 'timestamp': '2026-01-22 22:44:09.263339', '_unique_id': 'b1a59f6b6a214d5ba1fe202a372bd8aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.264 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.265 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a6c76a6-3f3a-47d9-9e53-f0419a679f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': 'instance-00000093-02611d7b-484c-4089-9de6-712e22cef735-tap960537b2-fe', 'timestamp': '2026-01-22T22:44:09.265153', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'tap960537b2-fe', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:88:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap960537b2-fe'}, 'message_id': 'dd67403e-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.843445475, 'message_signature': '13ce10b904fc3831b88a59c51eee1a72d14efef5ec84975be78a65af2c1bac17'}]}, 'timestamp': '2026-01-22 22:44:09.265549', '_unique_id': '7ee70b3633e4418ea4eb55394b8311b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.266 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.267 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.267 12 DEBUG ceilometer.compute.pollsters [-] 02611d7b-484c-4089-9de6-712e22cef735/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6b1da04-e0e5-49ea-80fb-dda0647e22bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-vda', 'timestamp': '2026-01-22T22:44:09.267287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd6793a4-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': 'ed89edfc7383e84d87acd7fdcdd4a73924116ac2254f05c70b74472021cea333'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_name': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_name': None, 'resource_id': '02611d7b-484c-4089-9de6-712e22cef735-sda', 'timestamp': '2026-01-22T22:44:09.267287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1047355164', 'name': 'instance-00000093', 'instance_id': '02611d7b-484c-4089-9de6-712e22cef735', 'instance_type': 'm1.nano', 'host': '18bcf965d24d8e970f51f0be16063be543e61a515f2c0093bfb20a74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd67a416-f7e3-11f0-9a35-fa163e3d8874', 'monotonic_time': 5486.799703032, 'message_signature': '8c1683b8290b8f60173ea2b5f25b58bcc6860a60d8fba1bc8823b353d8d05f56'}]}, 'timestamp': '2026-01-22 22:44:09.268088', '_unique_id': 'c36c1382e0e1484ebe4a6e1fbd068a30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.268 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.269 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.269 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>]
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.270 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:44:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:44:09.270 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueTestJSON-server-1047355164>]
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.304 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.422 182729 DEBUG nova.network.neutron [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.440 182729 INFO nova.compute.manager [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Took 0.79 seconds to deallocate network for instance.
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.528 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.528 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.591 182729 DEBUG nova.network.neutron [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updated VIF entry in instance network info cache for port 77888806-9b6a-4b3d-a528-863c5c5801a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.592 182729 DEBUG nova.network.neutron [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Updating instance_info_cache with network_info: [{"id": "77888806-9b6a-4b3d-a528-863c5c5801a7", "address": "fa:16:3e:2e:34:7d", "network": {"id": "20c3083d-5059-4bbb-a1bc-ca13d504e79c", "bridge": "br-int", "label": "tempest-network-smoke--2054740130", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77888806-9b", "ovs_interfaceid": "77888806-9b6a-4b3d-a528-863c5c5801a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.624 182729 DEBUG nova.compute.provider_tree [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.628 182729 DEBUG oslo_concurrency.lockutils [req-8af59baa-17aa-4c35-8c2d-d7a7b674db70 req-bb12d523-43e1-40f7-9570-ed717934e82e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1e7db515-c991-4967-b53b-01c33eaadab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.648 182729 DEBUG nova.scheduler.client.report [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.673 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.699 182729 INFO nova.scheduler.client.report [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance 1e7db515-c991-4967-b53b-01c33eaadab2
Jan 22 22:44:09 compute-0 nova_compute[182725]: 2026-01-22 22:44:09.821 182729 DEBUG oslo_concurrency.lockutils [None req-9b26bfcd-cf56-41e4-a277-8c8ceaa4bbf0 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.272 182729 DEBUG nova.compute.manager [req-24862407-ab0a-430f-a1f6-1b2f55682606 req-912e10d1-9fe1-4bcb-8b70-ad7887e48e87 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-deleted-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.273 182729 INFO nova.compute.manager [req-24862407-ab0a-430f-a1f6-1b2f55682606 req-912e10d1-9fe1-4bcb-8b70-ad7887e48e87 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Neutron deleted interface 77888806-9b6a-4b3d-a528-863c5c5801a7; detaching it from the instance and deleting it from the info cache
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.274 182729 DEBUG nova.network.neutron [req-24862407-ab0a-430f-a1f6-1b2f55682606 req-912e10d1-9fe1-4bcb-8b70-ad7887e48e87 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.277 182729 DEBUG nova.compute.manager [req-24862407-ab0a-430f-a1f6-1b2f55682606 req-912e10d1-9fe1-4bcb-8b70-ad7887e48e87 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Detach interface failed, port_id=77888806-9b6a-4b3d-a528-863c5c5801a7, reason: Instance 1e7db515-c991-4967-b53b-01c33eaadab2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.964 182729 DEBUG nova.compute.manager [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.964 182729 DEBUG oslo_concurrency.lockutils [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.965 182729 DEBUG oslo_concurrency.lockutils [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.965 182729 DEBUG oslo_concurrency.lockutils [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1e7db515-c991-4967-b53b-01c33eaadab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.965 182729 DEBUG nova.compute.manager [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] No waiting events found dispatching network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:10 compute-0 nova_compute[182725]: 2026-01-22 22:44:10.966 182729 WARNING nova.compute.manager [req-003d8e7d-93b9-4d6d-8193-c368cef9c8d7 req-3f5693b0-99bb-47fa-bf5e-c748bfc8a154 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Received unexpected event network-vif-plugged-77888806-9b6a-4b3d-a528-863c5c5801a7 for instance with vm_state deleted and task_state None.
Jan 22 22:44:11 compute-0 nova_compute[182725]: 2026-01-22 22:44:11.625 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:11 compute-0 nova_compute[182725]: 2026-01-22 22:44:11.801 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:12.452 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:12.453 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:12.454 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:13 compute-0 nova_compute[182725]: 2026-01-22 22:44:13.565 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:14 compute-0 nova_compute[182725]: 2026-01-22 22:44:14.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:16 compute-0 nova_compute[182725]: 2026-01-22 22:44:16.453 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.568 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 kernel: tap960537b2-fe (unregistering): left promiscuous mode
Jan 22 22:44:18 compute-0 NetworkManager[54954]: <info>  [1769121858.6999] device (tap960537b2-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00602|binding|INFO|Releasing lport 960537b2-fe8a-48ce-ace9-b39c09a20598 from this chassis (sb_readonly=0)
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00603|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 down in Southbound
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.704 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00604|binding|INFO|Removing iface tap960537b2-fe ovn-installed in OVS
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.706 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.714 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.715 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.716 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.716 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c8072397-4037-4080-a899-304d4e5acca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.718 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000093.scope: Deactivated successfully.
Jan 22 22:44:18 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000093.scope: Consumed 12.556s CPU time.
Jan 22 22:44:18 compute-0 systemd-machined[154006]: Machine qemu-66-instance-00000093 terminated.
Jan 22 22:44:18 compute-0 podman[233185]: 2026-01-22 22:44:18.795608079 +0000 UTC m=+0.066455540 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:44:18 compute-0 podman[233187]: 2026-01-22 22:44:18.801623949 +0000 UTC m=+0.070308266 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:44:18 compute-0 podman[233188]: 2026-01-22 22:44:18.8016529 +0000 UTC m=+0.060516352 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:44:18 compute-0 kernel: tap960537b2-fe: entered promiscuous mode
Jan 22 22:44:18 compute-0 kernel: tap960537b2-fe (unregistering): left promiscuous mode
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00605|binding|INFO|Claiming lport 960537b2-fe8a-48ce-ace9-b39c09a20598 for this chassis.
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00606|binding|INFO|960537b2-fe8a-48ce-ace9-b39c09a20598: Claiming fa:16:3e:d5:88:cb 10.100.0.12
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.952 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.953 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.953 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.954 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ce495511-fcc0-484d-9eae-a406af314161]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:18 compute-0 ovn_controller[94850]: 2026-01-22T22:44:18Z|00607|binding|INFO|Releasing lport 960537b2-fe8a-48ce-ace9-b39c09a20598 from this chassis (sb_readonly=0)
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.967 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.967 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.968 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:44:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:18.968 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7c6fc9-e737-409c-9e08-92d608f2c4bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:18 compute-0 nova_compute[182725]: 2026-01-22 22:44:18.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.308 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.467 182729 INFO nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance shutdown successfully after 13 seconds.
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.473 182729 INFO nova.virt.libvirt.driver [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance destroyed successfully.
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.473 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'numa_topology' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.492 182729 INFO nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Attempting rescue
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.493 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.496 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.496 182729 INFO nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Creating image(s)
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.497 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.497 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.498 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.498 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.526 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.527 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.538 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.637 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.639 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.677 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.678 182729 DEBUG oslo_concurrency.lockutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.678 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'migration_context' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.692 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.693 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start _get_guest_xml network_info=[{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:d5:88:cb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.693 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'resources' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.713 182729 WARNING nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.724 182729 DEBUG nova.virt.libvirt.host [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.725 182729 DEBUG nova.virt.libvirt.host [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.729 182729 DEBUG nova.virt.libvirt.host [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.730 182729 DEBUG nova.virt.libvirt.host [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.731 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.731 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.732 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.732 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.732 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.732 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.732 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.733 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.733 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.733 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.733 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.734 182729 DEBUG nova.virt.hardware [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.734 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.750 182729 DEBUG nova.virt.libvirt.vif [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1047355164',display_name='tempest-ServerRescueTestJSON-server-1047355164',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1047355164',id=147,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-2pi3nzyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-697248807-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:00Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=02611d7b-484c-4089-9de6-712e22cef735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:d5:88:cb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.751 182729 DEBUG nova.network.os_vif_util [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:d5:88:cb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.752 182729 DEBUG nova.network.os_vif_util [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.752 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.766 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <uuid>02611d7b-484c-4089-9de6-712e22cef735</uuid>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <name>instance-00000093</name>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:name>tempest-ServerRescueTestJSON-server-1047355164</nova:name>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:44:19</nova:creationTime>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:user uuid="21487f95977a444e83139b6e5faf83ce">tempest-ServerRescueTestJSON-697248807-project-member</nova:user>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:project uuid="c005f10296264b39a882736d172d2b47">tempest-ServerRescueTestJSON-697248807</nova:project>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         <nova:port uuid="960537b2-fe8a-48ce-ace9-b39c09a20598">
Jan 22 22:44:19 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <system>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="serial">02611d7b-484c-4089-9de6-712e22cef735</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="uuid">02611d7b-484c-4089-9de6-712e22cef735</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </system>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <os>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </os>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <features>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </features>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <target dev="vdb" bus="virtio"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config.rescue"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:d5:88:cb"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <target dev="tap960537b2-fe"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/console.log" append="off"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <video>
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </video>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:44:19 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:44:19 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:44:19 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:44:19 compute-0 nova_compute[182725]: </domain>
Jan 22 22:44:19 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.775 182729 INFO nova.virt.libvirt.driver [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance destroyed successfully.
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.824 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.825 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.825 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.825 182729 DEBUG nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No VIF found with MAC fa:16:3e:d5:88:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.825 182729 INFO nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Using config drive
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.839 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:19 compute-0 nova_compute[182725]: 2026-01-22 22:44:19.867 182729 DEBUG nova.objects.instance [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'keypairs' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.194 182729 INFO nova.virt.libvirt.driver [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Creating config drive at /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config.rescue
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.199 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwrkdh73 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.324 182729 DEBUG oslo_concurrency.processutils [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwrkdh73" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:20 compute-0 kernel: tap960537b2-fe: entered promiscuous mode
Jan 22 22:44:20 compute-0 systemd-udevd[233225]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:44:20 compute-0 NetworkManager[54954]: <info>  [1769121860.3983] manager: (tap960537b2-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 22 22:44:20 compute-0 ovn_controller[94850]: 2026-01-22T22:44:20Z|00608|binding|INFO|Claiming lport 960537b2-fe8a-48ce-ace9-b39c09a20598 for this chassis.
Jan 22 22:44:20 compute-0 ovn_controller[94850]: 2026-01-22T22:44:20Z|00609|binding|INFO|960537b2-fe8a-48ce-ace9-b39c09a20598: Claiming fa:16:3e:d5:88:cb 10.100.0.12
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:20 compute-0 NetworkManager[54954]: <info>  [1769121860.4099] device (tap960537b2-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:44:20 compute-0 NetworkManager[54954]: <info>  [1769121860.4115] device (tap960537b2-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:44:20 compute-0 ovn_controller[94850]: 2026-01-22T22:44:20Z|00610|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 ovn-installed in OVS
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.416 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.418 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:20 compute-0 ovn_controller[94850]: 2026-01-22T22:44:20Z|00611|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 up in Southbound
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.421 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:20 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:20.422 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:20 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:20.423 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis
Jan 22 22:44:20 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:20.424 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:44:20 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:20.425 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0b3f2a-cfbc-4181-a60d-3da65b975008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:44:20 compute-0 systemd-machined[154006]: New machine qemu-67-instance-00000093.
Jan 22 22:44:20 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000093.
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.965 182729 DEBUG nova.virt.libvirt.host [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Removed pending event for 02611d7b-484c-4089-9de6-712e22cef735 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.965 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121860.964826, 02611d7b-484c-4089-9de6-712e22cef735 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.966 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Resumed (Lifecycle Event)
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.983 182729 DEBUG nova.compute.manager [None req-1cf97a93-6eda-43d9-be85-ee9b450499d1 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.991 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:20 compute-0 nova_compute[182725]: 2026-01-22 22:44:20.994 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.048 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.048 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121860.9708765, 02611d7b-484c-4089-9de6-712e22cef735 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.048 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Started (Lifecycle Event)
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.075 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.079 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.232 182729 DEBUG nova.compute.manager [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.232 182729 DEBUG oslo_concurrency.lockutils [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.233 182729 DEBUG oslo_concurrency.lockutils [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.233 182729 DEBUG oslo_concurrency.lockutils [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.233 182729 DEBUG nova.compute.manager [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:21 compute-0 nova_compute[182725]: 2026-01-22 22:44:21.233 182729 WARNING nova.compute.manager [req-2ad99c40-3c85-46ea-89be-2826a9365bc9 req-26b1c92f-0870-4430-8f99-7325ff4a0448 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state rescued and task_state None.
Jan 22 22:44:22 compute-0 nova_compute[182725]: 2026-01-22 22:44:22.956 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.546 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121848.5460708, 1e7db515-c991-4967-b53b-01c33eaadab2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.547 182729 INFO nova.compute.manager [-] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] VM Stopped (Lifecycle Event)
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.555 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.556 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.556 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.556 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.556 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.556 182729 WARNING nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state rescued and task_state None.
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.557 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.557 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.557 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.558 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.558 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.558 182729 WARNING nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state rescued and task_state None.
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.558 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.559 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.559 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.559 182729 DEBUG oslo_concurrency.lockutils [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.559 182729 DEBUG nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.559 182729 WARNING nova.compute.manager [req-98b2b305-af7b-4059-8733-f7d295abb5fa req-edf3ac32-8053-40f4-bf31-a4cde1a6cd1c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state rescued and task_state None.
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.569 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:23 compute-0 nova_compute[182725]: 2026-01-22 22:44:23.572 182729 DEBUG nova.compute.manager [None req-4cef9a9d-7c0b-433d-af3e-e45b3fc8eb71 - - - - - -] [instance: 1e7db515-c991-4967-b53b-01c33eaadab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:44:24 compute-0 nova_compute[182725]: 2026-01-22 22:44:24.310 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:27 compute-0 nova_compute[182725]: 2026-01-22 22:44:27.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:27 compute-0 nova_compute[182725]: 2026-01-22 22:44:27.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:44:28 compute-0 nova_compute[182725]: 2026-01-22 22:44:28.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:29 compute-0 nova_compute[182725]: 2026-01-22 22:44:29.312 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.927 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.928 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.928 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.928 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:44:30 compute-0 nova_compute[182725]: 2026-01-22 22:44:30.985 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.081 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.082 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.158 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk.rescue --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.160 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.225 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.226 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.304 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.446 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.448 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5543MB free_disk=73.3027229309082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.448 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.449 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.571 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 02611d7b-484c-4089-9de6-712e22cef735 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.572 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.572 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.721 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.759 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.802 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:44:31 compute-0 nova_compute[182725]: 2026-01-22 22:44:31.803 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:44:32 compute-0 nova_compute[182725]: 2026-01-22 22:44:32.788 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:32 compute-0 nova_compute[182725]: 2026-01-22 22:44:32.789 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:44:32 compute-0 nova_compute[182725]: 2026-01-22 22:44:32.789 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:44:33 compute-0 nova_compute[182725]: 2026-01-22 22:44:33.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:33 compute-0 nova_compute[182725]: 2026-01-22 22:44:33.767 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:44:33 compute-0 nova_compute[182725]: 2026-01-22 22:44:33.767 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:44:33 compute-0 nova_compute[182725]: 2026-01-22 22:44:33.767 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:44:33 compute-0 nova_compute[182725]: 2026-01-22 22:44:33.767 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:44:34 compute-0 podman[233329]: 2026-01-22 22:44:34.132891528 +0000 UTC m=+0.062468581 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 22:44:34 compute-0 nova_compute[182725]: 2026-01-22 22:44:34.314 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.006 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updating instance_info_cache with network_info: [{"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.020 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-02611d7b-484c-4089-9de6-712e22cef735" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.020 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.021 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.021 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.021 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.022 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:44:37 compute-0 nova_compute[182725]: 2026-01-22 22:44:37.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:38 compute-0 nova_compute[182725]: 2026-01-22 22:44:38.579 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:38 compute-0 nova_compute[182725]: 2026-01-22 22:44:38.941 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:44:39 compute-0 podman[233352]: 2026-01-22 22:44:39.148624135 +0000 UTC m=+0.070322828 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, name=ubi9-minimal)
Jan 22 22:44:39 compute-0 podman[233351]: 2026-01-22 22:44:39.179242109 +0000 UTC m=+0.103069875 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:44:39 compute-0 nova_compute[182725]: 2026-01-22 22:44:39.316 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:42.026 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:44:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:42.026 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:44:42 compute-0 nova_compute[182725]: 2026-01-22 22:44:42.026 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:44:42.027 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:44:43 compute-0 nova_compute[182725]: 2026-01-22 22:44:43.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:44 compute-0 nova_compute[182725]: 2026-01-22 22:44:44.319 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:48 compute-0 nova_compute[182725]: 2026-01-22 22:44:48.588 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:49 compute-0 podman[233400]: 2026-01-22 22:44:49.142932691 +0000 UTC m=+0.069207153 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:44:49 compute-0 podman[233399]: 2026-01-22 22:44:49.151662178 +0000 UTC m=+0.074000352 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:44:49 compute-0 podman[233401]: 2026-01-22 22:44:49.169764719 +0000 UTC m=+0.093786935 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:44:49 compute-0 nova_compute[182725]: 2026-01-22 22:44:49.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:53 compute-0 nova_compute[182725]: 2026-01-22 22:44:53.592 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:54 compute-0 nova_compute[182725]: 2026-01-22 22:44:54.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:58 compute-0 nova_compute[182725]: 2026-01-22 22:44:58.594 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:44:59 compute-0 nova_compute[182725]: 2026-01-22 22:44:59.324 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:03 compute-0 nova_compute[182725]: 2026-01-22 22:45:03.597 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:03 compute-0 nova_compute[182725]: 2026-01-22 22:45:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:03 compute-0 nova_compute[182725]: 2026-01-22 22:45:03.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:45:03 compute-0 nova_compute[182725]: 2026-01-22 22:45:03.903 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:45:04 compute-0 nova_compute[182725]: 2026-01-22 22:45:04.326 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:05 compute-0 podman[233468]: 2026-01-22 22:45:05.141184229 +0000 UTC m=+0.070218638 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.450 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.450 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.451 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.451 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.451 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.463 182729 INFO nova.compute.manager [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Terminating instance
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.479 182729 DEBUG nova.compute.manager [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:45:07 compute-0 kernel: tap960537b2-fe (unregistering): left promiscuous mode
Jan 22 22:45:07 compute-0 NetworkManager[54954]: <info>  [1769121907.5758] device (tap960537b2-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:45:07 compute-0 ovn_controller[94850]: 2026-01-22T22:45:07Z|00612|binding|INFO|Releasing lport 960537b2-fe8a-48ce-ace9-b39c09a20598 from this chassis (sb_readonly=0)
Jan 22 22:45:07 compute-0 ovn_controller[94850]: 2026-01-22T22:45:07Z|00613|binding|INFO|Setting lport 960537b2-fe8a-48ce-ace9-b39c09a20598 down in Southbound
Jan 22 22:45:07 compute-0 ovn_controller[94850]: 2026-01-22T22:45:07Z|00614|binding|INFO|Removing iface tap960537b2-fe ovn-installed in OVS
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.580 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:07.590 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:88:cb 10.100.0.12'], port_security=['fa:16:3e:d5:88:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '02611d7b-484c-4089-9de6-712e22cef735', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=960537b2-fe8a-48ce-ace9-b39c09a20598) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:45:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:07.593 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 960537b2-fe8a-48ce-ace9-b39c09a20598 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis
Jan 22 22:45:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:07.595 104215 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 22:45:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:07.596 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b7833663-e6f1-4e02-9505-480884faeb32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.598 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:07 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000093.scope: Deactivated successfully.
Jan 22 22:45:07 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000093.scope: Consumed 13.603s CPU time.
Jan 22 22:45:07 compute-0 systemd-machined[154006]: Machine qemu-67-instance-00000093 terminated.
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.767 182729 INFO nova.virt.libvirt.driver [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Instance destroyed successfully.
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.768 182729 DEBUG nova.objects.instance [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'resources' on Instance uuid 02611d7b-484c-4089-9de6-712e22cef735 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.789 182729 DEBUG nova.virt.libvirt.vif [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1047355164',display_name='tempest-ServerRescueTestJSON-server-1047355164',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1047355164',id=147,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-2pi3nzyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-697248807-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:44:21Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=02611d7b-484c-4089-9de6-712e22cef735,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.789 182729 DEBUG nova.network.os_vif_util [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "960537b2-fe8a-48ce-ace9-b39c09a20598", "address": "fa:16:3e:d5:88:cb", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960537b2-fe", "ovs_interfaceid": "960537b2-fe8a-48ce-ace9-b39c09a20598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.789 182729 DEBUG nova.network.os_vif_util [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.790 182729 DEBUG os_vif [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.792 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap960537b2-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.793 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.796 182729 INFO os_vif [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:88:cb,bridge_name='br-int',has_traffic_filtering=True,id=960537b2-fe8a-48ce-ace9-b39c09a20598,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960537b2-fe')
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.797 182729 INFO nova.virt.libvirt.driver [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Deleting instance files /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735_del
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.797 182729 INFO nova.virt.libvirt.driver [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Deletion of /var/lib/nova/instances/02611d7b-484c-4089-9de6-712e22cef735_del complete
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.872 182729 INFO nova.compute.manager [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.872 182729 DEBUG oslo.service.loopingcall [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.872 182729 DEBUG nova.compute.manager [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:45:07 compute-0 nova_compute[182725]: 2026-01-22 22:45:07.873 182729 DEBUG nova.network.neutron [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.269 182729 DEBUG nova.compute.manager [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.270 182729 DEBUG oslo_concurrency.lockutils [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.270 182729 DEBUG oslo_concurrency.lockutils [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.270 182729 DEBUG oslo_concurrency.lockutils [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.270 182729 DEBUG nova.compute.manager [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.271 182729 DEBUG nova.compute.manager [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-unplugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.509 182729 DEBUG nova.network.neutron [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.530 182729 INFO nova.compute.manager [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Took 0.66 seconds to deallocate network for instance.
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.704 182729 DEBUG nova.compute.manager [req-0367d431-6f7a-4f28-b2e3-7f16b7bb3757 req-d3df1faf-669a-4ece-b465-ae2a2ec737ca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-deleted-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.797 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.797 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.856 182729 DEBUG nova.compute.provider_tree [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.870 182729 DEBUG nova.scheduler.client.report [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.888 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.909 182729 INFO nova.scheduler.client.report [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Deleted allocations for instance 02611d7b-484c-4089-9de6-712e22cef735
Jan 22 22:45:08 compute-0 nova_compute[182725]: 2026-01-22 22:45:08.973 182729 DEBUG oslo_concurrency.lockutils [None req-f0e7eba9-a27f-4b70-b8da-812cd2ce5bda 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:09 compute-0 nova_compute[182725]: 2026-01-22 22:45:09.328 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:10 compute-0 podman[233518]: 2026-01-22 22:45:10.148084151 +0000 UTC m=+0.068536436 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Jan 22 22:45:10 compute-0 podman[233517]: 2026-01-22 22:45:10.186738192 +0000 UTC m=+0.115882043 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.379 182729 DEBUG nova.compute.manager [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.380 182729 DEBUG oslo_concurrency.lockutils [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "02611d7b-484c-4089-9de6-712e22cef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.380 182729 DEBUG oslo_concurrency.lockutils [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.381 182729 DEBUG oslo_concurrency.lockutils [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "02611d7b-484c-4089-9de6-712e22cef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.381 182729 DEBUG nova.compute.manager [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] No waiting events found dispatching network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:45:10 compute-0 nova_compute[182725]: 2026-01-22 22:45:10.382 182729 WARNING nova.compute.manager [req-ed688ea8-0ecb-4bc4-a477-9ec188af37d7 req-7e9535d6-6dec-4aad-a3d6-35bad8ff02c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Received unexpected event network-vif-plugged-960537b2-fe8a-48ce-ace9-b39c09a20598 for instance with vm_state deleted and task_state None.
Jan 22 22:45:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:12.453 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:12.453 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:12.453 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:12 compute-0 nova_compute[182725]: 2026-01-22 22:45:12.795 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:14 compute-0 nova_compute[182725]: 2026-01-22 22:45:14.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:15 compute-0 nova_compute[182725]: 2026-01-22 22:45:15.894 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:17 compute-0 nova_compute[182725]: 2026-01-22 22:45:17.799 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:19 compute-0 nova_compute[182725]: 2026-01-22 22:45:19.333 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:20 compute-0 podman[233566]: 2026-01-22 22:45:20.135192041 +0000 UTC m=+0.061252375 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:45:20 compute-0 podman[233567]: 2026-01-22 22:45:20.16770232 +0000 UTC m=+0.083134520 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 22:45:20 compute-0 podman[233568]: 2026-01-22 22:45:20.189983174 +0000 UTC m=+0.096653596 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:45:22 compute-0 nova_compute[182725]: 2026-01-22 22:45:22.766 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121907.7652147, 02611d7b-484c-4089-9de6-712e22cef735 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:22 compute-0 nova_compute[182725]: 2026-01-22 22:45:22.767 182729 INFO nova.compute.manager [-] [instance: 02611d7b-484c-4089-9de6-712e22cef735] VM Stopped (Lifecycle Event)
Jan 22 22:45:22 compute-0 nova_compute[182725]: 2026-01-22 22:45:22.803 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:22 compute-0 nova_compute[182725]: 2026-01-22 22:45:22.807 182729 DEBUG nova.compute.manager [None req-cb730d4f-df2a-40a4-923a-ed93bc0c2271 - - - - - -] [instance: 02611d7b-484c-4089-9de6-712e22cef735] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:22 compute-0 nova_compute[182725]: 2026-01-22 22:45:22.897 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:24 compute-0 nova_compute[182725]: 2026-01-22 22:45:24.335 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:27 compute-0 nova_compute[182725]: 2026-01-22 22:45:27.806 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:29 compute-0 nova_compute[182725]: 2026-01-22 22:45:29.337 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.158 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.158 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.177 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.286 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.286 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.291 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.291 182729 INFO nova.compute.claims [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.405 182729 DEBUG nova.compute.provider_tree [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.418 182729 DEBUG nova.scheduler.client.report [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.435 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.436 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.501 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.502 182729 DEBUG nova.network.neutron [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.519 182729 INFO nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.539 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.658 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.659 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.659 182729 INFO nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Creating image(s)
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.660 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.660 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.661 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.673 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.731 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
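The two lines above show nova probing the cached base image with `qemu-img info`, wrapped in oslo.concurrency's `prlimit` helper to cap the subprocess at 1 GiB of address space and 30 s of CPU time (a defense against maliciously crafted images). A minimal sketch of assembling that argv and parsing the JSON `qemu-img` emits — the sample JSON below is hypothetical, not taken from this log:

```python
import json

# Resource caps nova passes to `python3 -m oslo_concurrency.prlimit`
# (values copied from the log line above).
ADDRESS_SPACE_BYTES = 1073741824  # --as, 1 GiB
CPU_SECONDS = 30                  # --cpu

def qemu_img_info_cmd(path):
    """Build the prlimit-wrapped `qemu-img info` argv seen in the log."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={ADDRESS_SPACE_BYTES}", f"--cpu={CPU_SECONDS}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]

cmd = qemu_img_info_cmd(
    "/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e")

# Hypothetical (abbreviated) `qemu-img info --output=json` output for a
# raw base image; real output carries more fields.
sample = '{"virtual-size": 1073741824, "format": "raw", "actual-size": 21430272}'
info = json.loads(sample)
```

`--force-share` lets the probe read an image that another process (e.g. a running guest) already holds open.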
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.732 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.733 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.750 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.811 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.812 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.863 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.864 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.865 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.901 182729 DEBUG nova.policy [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.908 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.908 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.920 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.921 182729 DEBUG nova.virt.disk.api [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.922 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.943 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.946 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.947 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.947 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.985 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.987 182729 DEBUG nova.virt.disk.api [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:45:31 compute-0 nova_compute[182725]: 2026-01-22 22:45:31.987 182729 DEBUG nova.objects.instance [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid ae2d061e-c057-4279-86d8-9e40ae86e1ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.002 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.003 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Ensure instance console log exists: /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.003 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.004 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.004 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.155 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.157 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.33221435546875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.157 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.157 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.223 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance ae2d061e-c057-4279-86d8-9e40ae86e1ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.223 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.224 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.267 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.283 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
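The inventory dict above is what the resource tracker reports to placement; for each resource class, the schedulable capacity is derived as `(total - reserved) * allocation_ratio`. Worked out with the values from this log (a sketch of the arithmetic, not placement's code):

```python
def capacity(total, reserved, allocation_ratio):
    """Schedulable capacity for one placement resource class:
    (total - reserved) * allocation_ratio."""
    return (total - reserved) * allocation_ratio

# Values copied from the inventory line above for provider
# 4f7db789-7f4b-4901-9c88-ecf66d0aff43.
vcpu = capacity(8, 0, 4.0)        # 8 physical vCPUs oversubscribed 4x -> 32.0
mem  = capacity(7679, 512, 1.0)   # 512 MB held back for the host -> 7167.0 MB
disk = capacity(79, 1, 0.9)       # 10% headroom on disk -> ~70.2 GB
```

This is why the tracker can report `total_vcpus=8, used_vcpus=1` while the scheduler still sees room for many more instances: the 4.0 VCPU allocation ratio quadruples the schedulable count.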
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.308 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.308 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.723 182729 DEBUG nova.network.neutron [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Successfully updated port: 02f58f6e-918c-4adc-af25-0761fe039b6d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.762 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.763 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.763 182729 DEBUG nova.network.neutron [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.810 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.926 182729 DEBUG nova.compute.manager [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.926 182729 DEBUG nova.compute.manager [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Refreshing instance network info cache due to event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:45:32 compute-0 nova_compute[182725]: 2026-01-22 22:45:32.927 182729 DEBUG oslo_concurrency.lockutils [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:45:33 compute-0 nova_compute[182725]: 2026-01-22 22:45:33.005 182729 DEBUG nova.network.neutron [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.287 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.753 182729 DEBUG nova.network.neutron [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updating instance_info_cache with network_info: [{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.774 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.774 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Instance network_info: |[{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
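The `network_info` blob logged above is a list of VIF entries, each nesting the port, its network, subnets, and IPs. A small sketch of walking that structure to pull out the fixed IPs — the JSON below is a trimmed excerpt of the entry in the log, not the full payload:

```python
import json

# Trimmed excerpt of the network_info VIF entry from the log above.
vif = json.loads('''{
  "id": "02f58f6e-918c-4adc-af25-0761fe039b6d",
  "address": "fa:16:3e:2c:b6:a6",
  "network": {"subnets": [{"cidr": "10.100.0.0/28",
      "ips": [{"address": "10.100.0.12", "type": "fixed"}]}],
    "meta": {"mtu": 1442}},
  "devname": "tap02f58f6e-91",
  "active": false
}''')

def fixed_ips(vif):
    """Collect fixed IPs across all subnets of one VIF entry."""
    return [ip["address"]
            for subnet in vif["network"]["subnets"]
            for ip in subnet["ips"]
            if ip["type"] == "fixed"]
```

Note `"active": false` at this point in the trace: the OVN-bound port exists but is not yet live; it flips to active once the guest is plugged and the network-vif-plugged event arrives.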
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.775 182729 DEBUG oslo_concurrency.lockutils [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.775 182729 DEBUG nova.network.neutron [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Refreshing network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.780 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Start _get_guest_xml network_info=[{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.785 182729 WARNING nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.793 182729 DEBUG nova.virt.libvirt.host [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.794 182729 DEBUG nova.virt.libvirt.host [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.798 182729 DEBUG nova.virt.libvirt.host [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.799 182729 DEBUG nova.virt.libvirt.host [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.801 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.801 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.802 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.802 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.803 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.803 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.804 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.804 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.805 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.805 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.805 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.806 182729 DEBUG nova.virt.hardware [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.812 182729 DEBUG nova.virt.libvirt.vif [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1928743683',display_name='tempest-TestNetworkBasicOps-server-1928743683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1928743683',id=153,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHAfT6r2m5pigYuIhbFw3DjS1lDwvZgeGPaMSPx9kMovGmyIaN7XWnz+1OaCliYMuLJZlzfUj+OLNZYiyWN1LDjR/cVp40EOhzAWlyMclPPaMWHRUYAQ0qF8GyB5jOs6TA==',key_name='tempest-TestNetworkBasicOps-257162631',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-zhhnok0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:31Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=ae2d061e-c057-4279-86d8-9e40ae86e1ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.812 182729 DEBUG nova.network.os_vif_util [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.814 182729 DEBUG nova.network.os_vif_util [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.816 182729 DEBUG nova.objects.instance [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ae2d061e-c057-4279-86d8-9e40ae86e1ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.834 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <uuid>ae2d061e-c057-4279-86d8-9e40ae86e1ef</uuid>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <name>instance-00000099</name>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-1928743683</nova:name>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:45:34</nova:creationTime>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         <nova:port uuid="02f58f6e-918c-4adc-af25-0761fe039b6d">
Jan 22 22:45:34 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <system>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="serial">ae2d061e-c057-4279-86d8-9e40ae86e1ef</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="uuid">ae2d061e-c057-4279-86d8-9e40ae86e1ef</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </system>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <os>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </os>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <features>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </features>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.config"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:2c:b6:a6"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <target dev="tap02f58f6e-91"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/console.log" append="off"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <video>
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </video>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:45:34 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:45:34 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:45:34 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:45:34 compute-0 nova_compute[182725]: </domain>
Jan 22 22:45:34 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.836 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Preparing to wait for external event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.837 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.838 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.838 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.840 182729 DEBUG nova.virt.libvirt.vif [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1928743683',display_name='tempest-TestNetworkBasicOps-server-1928743683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1928743683',id=153,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHAfT6r2m5pigYuIhbFw3DjS1lDwvZgeGPaMSPx9kMovGmyIaN7XWnz+1OaCliYMuLJZlzfUj+OLNZYiyWN1LDjR/cVp40EOhzAWlyMclPPaMWHRUYAQ0qF8GyB5jOs6TA==',key_name='tempest-TestNetworkBasicOps-257162631',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-zhhnok0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:31Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=ae2d061e-c057-4279-86d8-9e40ae86e1ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.841 182729 DEBUG nova.network.os_vif_util [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.841 182729 DEBUG nova.network.os_vif_util [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.842 182729 DEBUG os_vif [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.842 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.842 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.843 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.847 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.847 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02f58f6e-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.848 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02f58f6e-91, col_values=(('external_ids', {'iface-id': '02f58f6e-918c-4adc-af25-0761fe039b6d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:b6:a6', 'vm-uuid': 'ae2d061e-c057-4279-86d8-9e40ae86e1ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:34 compute-0 NetworkManager[54954]: <info>  [1769121934.8504] manager: (tap02f58f6e-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.853 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.856 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.857 182729 INFO os_vif [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91')
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.911 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.912 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.912 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:2c:b6:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:45:34 compute-0 nova_compute[182725]: 2026-01-22 22:45:34.912 182729 INFO nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Using config drive
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.343 182729 INFO nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Creating config drive at /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.config
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.354 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp119c51_4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.499 182729 DEBUG oslo_concurrency.processutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp119c51_4" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:35 compute-0 kernel: tap02f58f6e-91: entered promiscuous mode
Jan 22 22:45:35 compute-0 NetworkManager[54954]: <info>  [1769121935.6221] manager: (tap02f58f6e-91): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Jan 22 22:45:35 compute-0 ovn_controller[94850]: 2026-01-22T22:45:35Z|00615|binding|INFO|Claiming lport 02f58f6e-918c-4adc-af25-0761fe039b6d for this chassis.
Jan 22 22:45:35 compute-0 ovn_controller[94850]: 2026-01-22T22:45:35Z|00616|binding|INFO|02f58f6e-918c-4adc-af25-0761fe039b6d: Claiming fa:16:3e:2c:b6:a6 10.100.0.12
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.622 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.629 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.639 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:b6:a6 10.100.0.12'], port_security=['fa:16:3e:2c:b6:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae2d061e-c057-4279-86d8-9e40ae86e1ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57473ab8-82ff-44c6-9161-154974021c91', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d879b131-bf41-479d-8ea2-01de2458b7e4, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=02f58f6e-918c-4adc-af25-0761fe039b6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.641 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 02f58f6e-918c-4adc-af25-0761fe039b6d in datapath 57473ab8-82ff-44c6-9161-154974021c91 bound to our chassis
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.644 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.662 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef55869-7eb0-4ade-aa6f-1365343d3caa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.663 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57473ab8-81 in ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.668 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57473ab8-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.669 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7c74ca64-222d-449f-b666-21d3a1e0e618]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.670 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ec45ec-bbf5-4733-a6d4-d7d6758acf39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 systemd-machined[154006]: New machine qemu-68-instance-00000099.
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.688 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[10c0fe4f-9642-4c7e-b75e-79cd7ec3891c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:35 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000099.
Jan 22 22:45:35 compute-0 ovn_controller[94850]: 2026-01-22T22:45:35Z|00617|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d ovn-installed in OVS
Jan 22 22:45:35 compute-0 ovn_controller[94850]: 2026-01-22T22:45:35Z|00618|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d up in Southbound
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.722 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.719 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3a725f15-9325-41ea-9260-cd8a5ea00b0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 systemd-udevd[233689]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:45:35 compute-0 NetworkManager[54954]: <info>  [1769121935.7433] device (tap02f58f6e-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:45:35 compute-0 NetworkManager[54954]: <info>  [1769121935.7441] device (tap02f58f6e-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:45:35 compute-0 podman[233663]: 2026-01-22 22:45:35.760351036 +0000 UTC m=+0.146057875 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.777 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[91fc0c9a-137b-49b3-9871-601500571083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 systemd-udevd[233693]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:45:35 compute-0 NetworkManager[54954]: <info>  [1769121935.7860] manager: (tap57473ab8-80): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.786 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f11b836c-0b6b-4e47-82c9-c68032792148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.836 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfba1a9-fa8b-41e5-900a-b4b8989eccfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.840 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[39a0bd4f-93e8-46e1-b66e-05af94b5b8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 NetworkManager[54954]: <info>  [1769121935.8723] device (tap57473ab8-80): carrier: link connected
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.884 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[287583a7-7446-490d-8d8c-8d2106e54891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.912 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ec4330-c73f-4fc4-b665-d96b13a87ece]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57473ab8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:26:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557347, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233720, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.936 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[23bc6820-0c43-47ed-bd55-cc702fd5931b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe95:263c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557347, 'tstamp': 557347}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233721, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:35.962 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[15651716-7d04-4912-abea-203ce77a0ac5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57473ab8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:26:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557347, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233722, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.996 182729 DEBUG nova.compute.manager [req-4bad8323-f9e4-4576-971a-db446c6f0f86 req-cb052b88-49ea-4cc9-b59c-2acc77016607 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.997 182729 DEBUG oslo_concurrency.lockutils [req-4bad8323-f9e4-4576-971a-db446c6f0f86 req-cb052b88-49ea-4cc9-b59c-2acc77016607 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.998 182729 DEBUG oslo_concurrency.lockutils [req-4bad8323-f9e4-4576-971a-db446c6f0f86 req-cb052b88-49ea-4cc9-b59c-2acc77016607 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.998 182729 DEBUG oslo_concurrency.lockutils [req-4bad8323-f9e4-4576-971a-db446c6f0f86 req-cb052b88-49ea-4cc9-b59c-2acc77016607 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:35 compute-0 nova_compute[182725]: 2026-01-22 22:45:35.999 182729 DEBUG nova.compute.manager [req-4bad8323-f9e4-4576-971a-db446c6f0f86 req-cb052b88-49ea-4cc9-b59c-2acc77016607 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Processing event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.012 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c58b5b4d-d822-45cb-993e-98ebc375bcb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.122 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9608428b-0d30-4d8a-aebd-7825f5609eec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.124 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57473ab8-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.125 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.126 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57473ab8-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:36 compute-0 kernel: tap57473ab8-80: entered promiscuous mode
Jan 22 22:45:36 compute-0 NetworkManager[54954]: <info>  [1769121936.1318] manager: (tap57473ab8-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.132 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57473ab8-80, col_values=(('external_ids', {'iface-id': '70e143e4-e907-4d6c-9423-8967adb571db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.133 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:36 compute-0 ovn_controller[94850]: 2026-01-22T22:45:36Z|00619|binding|INFO|Releasing lport 70e143e4-e907-4d6c-9423-8967adb571db from this chassis (sb_readonly=0)
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.149 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.150 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.151 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2678ec1-9490-44c4-9fae-387bc4cfb030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.153 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:45:36 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:36.156 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'env', 'PROCESS_TAG=haproxy-57473ab8-82ff-44c6-9161-154974021c91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57473ab8-82ff-44c6-9161-154974021c91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.195 182729 DEBUG nova.network.neutron [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updated VIF entry in instance network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.196 182729 DEBUG nova.network.neutron [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updating instance_info_cache with network_info: [{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.217 182729 DEBUG oslo_concurrency.lockutils [req-284c1063-0f42-468a-8c76-1ffffe1916fe req-b251b54d-e873-433c-8153-96a3149bd047 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.406 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121936.405234, ae2d061e-c057-4279-86d8-9e40ae86e1ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.407 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] VM Started (Lifecycle Event)
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.411 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.415 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.420 182729 INFO nova.virt.libvirt.driver [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Instance spawned successfully.
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.421 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.427 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.431 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.446 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.446 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.447 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.448 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.449 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.450 182729 DEBUG nova.virt.libvirt.driver [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.457 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.457 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121936.4055226, ae2d061e-c057-4279-86d8-9e40ae86e1ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.458 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] VM Paused (Lifecycle Event)
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.486 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.491 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121936.4143388, ae2d061e-c057-4279-86d8-9e40ae86e1ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.491 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] VM Resumed (Lifecycle Event)
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.512 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.516 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.521 182729 INFO nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Took 4.86 seconds to spawn the instance on the hypervisor.
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.522 182729 DEBUG nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.542 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.601 182729 INFO nova.compute.manager [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Took 5.35 seconds to build instance.
Jan 22 22:45:36 compute-0 podman[233760]: 2026-01-22 22:45:36.645209774 +0000 UTC m=+0.074751641 container create 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:45:36 compute-0 nova_compute[182725]: 2026-01-22 22:45:36.670 182729 DEBUG oslo_concurrency.lockutils [None req-b536d216-91dc-4167-9383-512b718150d7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:36 compute-0 systemd[1]: Started libpod-conmon-5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71.scope.
Jan 22 22:45:36 compute-0 podman[233760]: 2026-01-22 22:45:36.609399483 +0000 UTC m=+0.038941440 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:45:36 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:45:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/676a9468e91354537e615dcd5f70d519637690662f31f7c2351e8e7a3a643e32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:45:36 compute-0 podman[233760]: 2026-01-22 22:45:36.748474233 +0000 UTC m=+0.178016130 container init 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:45:36 compute-0 podman[233760]: 2026-01-22 22:45:36.755096888 +0000 UTC m=+0.184638765 container start 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:45:36 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [NOTICE]   (233779) : New worker (233781) forked
Jan 22 22:45:36 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [NOTICE]   (233779) : Loading success.
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.073 182729 DEBUG nova.compute.manager [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.074 182729 DEBUG oslo_concurrency.lockutils [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.075 182729 DEBUG oslo_concurrency.lockutils [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.075 182729 DEBUG oslo_concurrency.lockutils [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.076 182729 DEBUG nova.compute.manager [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] No waiting events found dispatching network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.076 182729 WARNING nova.compute.manager [req-98ff63c2-0d0f-4585-b031-d96dde07b5a2 req-aaa77cb9-1728-42b3-8835-6b948308a8e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received unexpected event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with vm_state active and task_state None.
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:38 compute-0 nova_compute[182725]: 2026-01-22 22:45:38.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:45:39 compute-0 nova_compute[182725]: 2026-01-22 22:45:39.342 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:39 compute-0 nova_compute[182725]: 2026-01-22 22:45:39.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:39 compute-0 nova_compute[182725]: 2026-01-22 22:45:39.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:41 compute-0 podman[233791]: 2026-01-22 22:45:41.131171384 +0000 UTC m=+0.060571238 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 22 22:45:41 compute-0 podman[233790]: 2026-01-22 22:45:41.186838669 +0000 UTC m=+0.118296045 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:45:43 compute-0 NetworkManager[54954]: <info>  [1769121943.0565] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 22 22:45:43 compute-0 NetworkManager[54954]: <info>  [1769121943.0576] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.214 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 ovn_controller[94850]: 2026-01-22T22:45:43Z|00620|binding|INFO|Releasing lport 70e143e4-e907-4d6c-9423-8967adb571db from this chassis (sb_readonly=0)
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.238 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.373 182729 DEBUG nova.compute.manager [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.374 182729 DEBUG nova.compute.manager [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Refreshing instance network info cache due to event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.375 182729 DEBUG oslo_concurrency.lockutils [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.376 182729 DEBUG oslo_concurrency.lockutils [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.376 182729 DEBUG nova.network.neutron [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Refreshing network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.611 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.612 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.612 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.613 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.613 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.628 182729 INFO nova.compute.manager [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Terminating instance
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.642 182729 DEBUG nova.compute.manager [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:45:43 compute-0 kernel: tap02f58f6e-91 (unregistering): left promiscuous mode
Jan 22 22:45:43 compute-0 NetworkManager[54954]: <info>  [1769121943.6607] device (tap02f58f6e-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:45:43 compute-0 ovn_controller[94850]: 2026-01-22T22:45:43Z|00621|binding|INFO|Releasing lport 02f58f6e-918c-4adc-af25-0761fe039b6d from this chassis (sb_readonly=0)
Jan 22 22:45:43 compute-0 ovn_controller[94850]: 2026-01-22T22:45:43Z|00622|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d down in Southbound
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.667 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 ovn_controller[94850]: 2026-01-22T22:45:43Z|00623|binding|INFO|Removing iface tap02f58f6e-91 ovn-installed in OVS
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.670 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.675 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:b6:a6 10.100.0.12'], port_security=['fa:16:3e:2c:b6:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ae2d061e-c057-4279-86d8-9e40ae86e1ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57473ab8-82ff-44c6-9161-154974021c91', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d879b131-bf41-479d-8ea2-01de2458b7e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=02f58f6e-918c-4adc-af25-0761fe039b6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.677 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 02f58f6e-918c-4adc-af25-0761fe039b6d in datapath 57473ab8-82ff-44c6-9161-154974021c91 unbound from our chassis
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.678 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57473ab8-82ff-44c6-9161-154974021c91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.679 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8a350441-9fab-485e-8eaa-96d6cafc5753]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.679 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 namespace which is not needed anymore
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.690 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000099.scope: Deactivated successfully.
Jan 22 22:45:43 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000099.scope: Consumed 7.938s CPU time.
Jan 22 22:45:43 compute-0 systemd-machined[154006]: Machine qemu-68-instance-00000099 terminated.
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [NOTICE]   (233779) : haproxy version is 2.8.14-c23fe91
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [NOTICE]   (233779) : path to executable is /usr/sbin/haproxy
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [WARNING]  (233779) : Exiting Master process...
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [WARNING]  (233779) : Exiting Master process...
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [ALERT]    (233779) : Current worker (233781) exited with code 143 (Terminated)
Jan 22 22:45:43 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[233775]: [WARNING]  (233779) : All workers exited. Exiting... (0)
Jan 22 22:45:43 compute-0 systemd[1]: libpod-5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71.scope: Deactivated successfully.
Jan 22 22:45:43 compute-0 podman[233855]: 2026-01-22 22:45:43.824359975 +0000 UTC m=+0.055238695 container died 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 22:45:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71-userdata-shm.mount: Deactivated successfully.
Jan 22 22:45:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-676a9468e91354537e615dcd5f70d519637690662f31f7c2351e8e7a3a643e32-merged.mount: Deactivated successfully.
Jan 22 22:45:43 compute-0 podman[233855]: 2026-01-22 22:45:43.865992151 +0000 UTC m=+0.096870871 container cleanup 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 22:45:43 compute-0 systemd[1]: libpod-conmon-5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71.scope: Deactivated successfully.
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.905 182729 INFO nova.virt.libvirt.driver [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Instance destroyed successfully.
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.905 182729 DEBUG nova.objects.instance [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid ae2d061e-c057-4279-86d8-9e40ae86e1ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.918 182729 DEBUG nova.virt.libvirt.vif [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:45:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1928743683',display_name='tempest-TestNetworkBasicOps-server-1928743683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1928743683',id=153,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHAfT6r2m5pigYuIhbFw3DjS1lDwvZgeGPaMSPx9kMovGmyIaN7XWnz+1OaCliYMuLJZlzfUj+OLNZYiyWN1LDjR/cVp40EOhzAWlyMclPPaMWHRUYAQ0qF8GyB5jOs6TA==',key_name='tempest-TestNetworkBasicOps-257162631',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:45:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-zhhnok0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:36Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=ae2d061e-c057-4279-86d8-9e40ae86e1ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.919 182729 DEBUG nova.network.os_vif_util [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.919 182729 DEBUG nova.network.os_vif_util [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.920 182729 DEBUG os_vif [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.921 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.921 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f58f6e-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:43 compute-0 podman[233896]: 2026-01-22 22:45:43.921875822 +0000 UTC m=+0.036691824 container remove 5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.922 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.924 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.926 182729 INFO os_vif [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91')
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.926 182729 INFO nova.virt.libvirt.driver [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Deleting instance files /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef_del
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.927 182729 INFO nova.virt.libvirt.driver [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Deletion of /var/lib/nova/instances/ae2d061e-c057-4279-86d8-9e40ae86e1ef_del complete
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.928 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7bade5-f44f-4c2d-b1d4-2ed1fff3ee28]: (4, ('Thu Jan 22 10:45:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 (5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71)\n5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71\nThu Jan 22 10:45:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 (5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71)\n5ab92eb377652f8ff7a4e425b10798f1af349c48ce549758d5bbcc7557ecdf71\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.929 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d77bfb1e-3a30-48d8-b5f7-8dbae6c2499b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.930 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57473ab8-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.931 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 kernel: tap57473ab8-80: left promiscuous mode
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.945 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[781dc911-0a05-4f7e-a66b-48c5269607d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.968 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b89026c1-824f-4d97-8d0d-74e4a81d12af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.969 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c8089801-fe31-411d-934a-d5cf0cc56f28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.982 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[48941b06-7771-49b7-9ea3-039411251cba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557336, 'reachable_time': 22454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233916, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d57473ab8\x2d82ff\x2d44c6\x2d9161\x2d154974021c91.mount: Deactivated successfully.
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.986 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:45:43 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:43.986 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9c28f4-db54-466e-be0c-cd017e2db05d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.994 182729 INFO nova.compute.manager [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.994 182729 DEBUG oslo.service.loopingcall [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.995 182729 DEBUG nova.compute.manager [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:45:43 compute-0 nova_compute[182725]: 2026-01-22 22:45:43.995 182729 DEBUG nova.network.neutron [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.008 182729 DEBUG nova.compute.manager [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.008 182729 DEBUG oslo_concurrency.lockutils [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.008 182729 DEBUG oslo_concurrency.lockutils [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.009 182729 DEBUG oslo_concurrency.lockutils [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.009 182729 DEBUG nova.compute.manager [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] No waiting events found dispatching network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.009 182729 DEBUG nova.compute.manager [req-54c4a6f0-1350-49c9-a5e9-1cc0bf3c74d8 req-adffa222-2ec5-4eae-a5ea-6c000b248f25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.344 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.778 182729 DEBUG nova.network.neutron [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updated VIF entry in instance network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.779 182729 DEBUG nova.network.neutron [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updating instance_info_cache with network_info: [{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:44 compute-0 nova_compute[182725]: 2026-01-22 22:45:44.801 182729 DEBUG oslo_concurrency.lockutils [req-355853a1-f354-4120-84fb-29bc1ec3e7eb req-c547abc3-3528-47e7-b24a-b6886ac177e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-ae2d061e-c057-4279-86d8-9e40ae86e1ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.631 182729 DEBUG nova.network.neutron [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.649 182729 INFO nova.compute.manager [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Took 1.65 seconds to deallocate network for instance.
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.724 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.725 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.806 182729 DEBUG nova.compute.provider_tree [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.828 182729 DEBUG nova.scheduler.client.report [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.854 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.874 182729 INFO nova.scheduler.client.report [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance ae2d061e-c057-4279-86d8-9e40ae86e1ef
Jan 22 22:45:45 compute-0 nova_compute[182725]: 2026-01-22 22:45:45.943 182729 DEBUG oslo_concurrency.lockutils [None req-6b5f3976-bed9-46c5-af81-a9dbb6dfb499 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.092 182729 DEBUG nova.compute.manager [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.093 182729 DEBUG oslo_concurrency.lockutils [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.093 182729 DEBUG oslo_concurrency.lockutils [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.093 182729 DEBUG oslo_concurrency.lockutils [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ae2d061e-c057-4279-86d8-9e40ae86e1ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.094 182729 DEBUG nova.compute.manager [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] No waiting events found dispatching network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:45:46 compute-0 nova_compute[182725]: 2026-01-22 22:45:46.094 182729 WARNING nova.compute.manager [req-f718be57-7985-48f1-96f1-3d6ae4d2ed45 req-ef5f7692-251c-45ef-a0a6-0c56b77e85e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Received unexpected event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with vm_state deleted and task_state None.
Jan 22 22:45:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:48.015 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:45:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:48.015 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:45:48 compute-0 nova_compute[182725]: 2026-01-22 22:45:48.050 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:48 compute-0 nova_compute[182725]: 2026-01-22 22:45:48.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:45:48 compute-0 nova_compute[182725]: 2026-01-22 22:45:48.922 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:49 compute-0 nova_compute[182725]: 2026-01-22 22:45:49.346 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:50.017 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:51 compute-0 podman[233919]: 2026-01-22 22:45:51.14106842 +0000 UTC m=+0.062558527 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:45:51 compute-0 podman[233917]: 2026-01-22 22:45:51.146722541 +0000 UTC m=+0.073889680 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:45:51 compute-0 podman[233918]: 2026-01-22 22:45:51.176561873 +0000 UTC m=+0.093253921 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:45:53 compute-0 nova_compute[182725]: 2026-01-22 22:45:53.925 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.087 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.087 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.117 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.236 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.237 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.243 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.243 182729 INFO nova.compute.claims [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.348 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.405 182729 DEBUG nova.compute.provider_tree [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.422 182729 DEBUG nova.scheduler.client.report [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.441 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.442 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.490 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.491 182729 DEBUG nova.network.neutron [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.534 182729 INFO nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.577 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.903 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.905 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.905 182729 INFO nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Creating image(s)
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.906 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.906 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.907 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:54 compute-0 nova_compute[182725]: 2026-01-22 22:45:54.921 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.013 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.014 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.015 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.032 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.096 182729 DEBUG nova.policy [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.104 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.105 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.157 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.158 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.159 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.220 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.221 182729 DEBUG nova.virt.disk.api [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.222 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.292 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.293 182729 DEBUG nova.virt.disk.api [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.294 182729 DEBUG nova.objects.instance [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid a5d23dff-3c57-4220-b086-cde557bfedc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.319 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.321 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Ensure instance console log exists: /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.321 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.322 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:55 compute-0 nova_compute[182725]: 2026-01-22 22:45:55.322 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.182 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.421 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.461 182729 DEBUG nova.network.neutron [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Successfully updated port: 02f58f6e-918c-4adc-af25-0761fe039b6d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.480 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.481 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.482 182729 DEBUG nova.network.neutron [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:45:56 compute-0 nova_compute[182725]: 2026-01-22 22:45:56.645 182729 DEBUG nova.network.neutron [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:45:57 compute-0 nova_compute[182725]: 2026-01-22 22:45:57.092 182729 DEBUG nova.compute.manager [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:57 compute-0 nova_compute[182725]: 2026-01-22 22:45:57.093 182729 DEBUG nova.compute.manager [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Refreshing instance network info cache due to event network-changed-02f58f6e-918c-4adc-af25-0761fe039b6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:45:57 compute-0 nova_compute[182725]: 2026-01-22 22:45:57.093 182729 DEBUG oslo_concurrency.lockutils [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.050 182729 DEBUG nova.network.neutron [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Updating instance_info_cache with network_info: [{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.070 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.070 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Instance network_info: |[{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.071 182729 DEBUG oslo_concurrency.lockutils [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.071 182729 DEBUG nova.network.neutron [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Refreshing network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.077 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Start _get_guest_xml network_info=[{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.084 182729 WARNING nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.088 182729 DEBUG nova.virt.libvirt.host [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.089 182729 DEBUG nova.virt.libvirt.host [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.100 182729 DEBUG nova.virt.libvirt.host [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.101 182729 DEBUG nova.virt.libvirt.host [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.103 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.104 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.105 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.105 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.106 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.106 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.107 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.107 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.108 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.108 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.109 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.109 182729 DEBUG nova.virt.hardware [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.117 182729 DEBUG nova.virt.libvirt.vif [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1264621857',display_name='tempest-TestNetworkBasicOps-server-1264621857',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1264621857',id=155,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL8HVaMwijqw7UVNY2qkT6BgdSKDoB2f7ifVMaeOHV0OXbyzxW+6gVnW+u4K3+cegoj/NJAAfhOYUVNwniyfHI2YC4rnV/O04VRZPoVUVVMkWou3jvw8D9n07Md6d5jXRw==',key_name='tempest-TestNetworkBasicOps-2031362568',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ns5sizss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:54Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=a5d23dff-3c57-4220-b086-cde557bfedc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.117 182729 DEBUG nova.network.os_vif_util [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.119 182729 DEBUG nova.network.os_vif_util [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.121 182729 DEBUG nova.objects.instance [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid a5d23dff-3c57-4220-b086-cde557bfedc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.136 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <uuid>a5d23dff-3c57-4220-b086-cde557bfedc2</uuid>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <name>instance-0000009b</name>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:name>tempest-TestNetworkBasicOps-server-1264621857</nova:name>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:45:58</nova:creationTime>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         <nova:port uuid="02f58f6e-918c-4adc-af25-0761fe039b6d">
Jan 22 22:45:58 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <system>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="serial">a5d23dff-3c57-4220-b086-cde557bfedc2</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="uuid">a5d23dff-3c57-4220-b086-cde557bfedc2</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </system>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <os>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </os>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <features>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </features>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.config"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:2c:b6:a6"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <target dev="tap02f58f6e-91"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/console.log" append="off"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <video>
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </video>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:45:58 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:45:58 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:45:58 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:45:58 compute-0 nova_compute[182725]: </domain>
Jan 22 22:45:58 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.138 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Preparing to wait for external event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.139 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.139 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.140 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.141 182729 DEBUG nova.virt.libvirt.vif [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1264621857',display_name='tempest-TestNetworkBasicOps-server-1264621857',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1264621857',id=155,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL8HVaMwijqw7UVNY2qkT6BgdSKDoB2f7ifVMaeOHV0OXbyzxW+6gVnW+u4K3+cegoj/NJAAfhOYUVNwniyfHI2YC4rnV/O04VRZPoVUVVMkWou3jvw8D9n07Md6d5jXRw==',key_name='tempest-TestNetworkBasicOps-2031362568',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ns5sizss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:54Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=a5d23dff-3c57-4220-b086-cde557bfedc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.142 182729 DEBUG nova.network.os_vif_util [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.143 182729 DEBUG nova.network.os_vif_util [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.144 182729 DEBUG os_vif [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.145 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.146 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.147 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.152 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.152 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02f58f6e-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.153 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02f58f6e-91, col_values=(('external_ids', {'iface-id': '02f58f6e-918c-4adc-af25-0761fe039b6d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:b6:a6', 'vm-uuid': 'a5d23dff-3c57-4220-b086-cde557bfedc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.155 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:58 compute-0 NetworkManager[54954]: <info>  [1769121958.1559] manager: (tap02f58f6e-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.158 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.163 182729 INFO os_vif [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91')
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.225 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.225 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.226 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:2c:b6:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.227 182729 INFO nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Using config drive
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.855 182729 INFO nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Creating config drive at /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.config
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.860 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp41eyiq5p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.904 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121943.9034622, ae2d061e-c057-4279-86d8-9e40ae86e1ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.905 182729 INFO nova.compute.manager [-] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] VM Stopped (Lifecycle Event)
Jan 22 22:45:58 compute-0 nova_compute[182725]: 2026-01-22 22:45:58.936 182729 DEBUG nova.compute.manager [None req-951a2564-5780-4462-8c9d-5cd263d79044 - - - - - -] [instance: ae2d061e-c057-4279-86d8-9e40ae86e1ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.000 182729 DEBUG oslo_concurrency.processutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp41eyiq5p" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.0656] manager: (tap02f58f6e-91): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 22 22:45:59 compute-0 kernel: tap02f58f6e-91: entered promiscuous mode
Jan 22 22:45:59 compute-0 ovn_controller[94850]: 2026-01-22T22:45:59Z|00624|binding|INFO|Claiming lport 02f58f6e-918c-4adc-af25-0761fe039b6d for this chassis.
Jan 22 22:45:59 compute-0 ovn_controller[94850]: 2026-01-22T22:45:59Z|00625|binding|INFO|02f58f6e-918c-4adc-af25-0761fe039b6d: Claiming fa:16:3e:2c:b6:a6 10.100.0.12
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.072 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.0794] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.077 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.0803] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.091 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:b6:a6 10.100.0.12'], port_security=['fa:16:3e:2c:b6:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a5d23dff-3c57-4220-b086-cde557bfedc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57473ab8-82ff-44c6-9161-154974021c91', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '7', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d879b131-bf41-479d-8ea2-01de2458b7e4, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=02f58f6e-918c-4adc-af25-0761fe039b6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.094 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 02f58f6e-918c-4adc-af25-0761fe039b6d in datapath 57473ab8-82ff-44c6-9161-154974021c91 bound to our chassis
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.096 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:59 compute-0 systemd-udevd[234011]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.113 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[48a40ad3-bd3d-4f05-b3f2-7f040ade6842]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.114 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57473ab8-81 in ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.117 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57473ab8-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.117 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[13c19a73-d53d-42da-a51a-708de3f9d827]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.118 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7aea6e3c-fc4a-4a79-96ed-27787095781c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.1283] device (tap02f58f6e-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.1291] device (tap02f58f6e-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:45:59 compute-0 systemd-machined[154006]: New machine qemu-69-instance-0000009b.
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.133 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[db948641-844c-4c70-9252-2756d55c043e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.163 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[89203b65-0ede-43f6-860e-0953b7b4e9dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-0000009b.
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.200 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[7fae7a8f-510f-4148-a8cd-ed6add682e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.2223] manager: (tap57473ab8-80): new Veth device (/org/freedesktop/NetworkManager/Devices/286)
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.223 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b15308af-34d9-4df0-900e-9daf10f77cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 systemd-udevd[234018]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.234 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.255 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_controller[94850]: 2026-01-22T22:45:59Z|00626|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d ovn-installed in OVS
Jan 22 22:45:59 compute-0 ovn_controller[94850]: 2026-01-22T22:45:59Z|00627|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d up in Southbound
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.276 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b50f4b0b-d58c-4ab2-99e9-a48c5b1a1c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.280 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f73921d9-4411-4866-9b7c-ba8c0c78c267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.3135] device (tap57473ab8-80): carrier: link connected
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.322 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd4c257-9918-4b96-aaee-5be3d41a3688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.345 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dbc970-a697-4f95-8df2-9a5ed144fd24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57473ab8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:26:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559691, 'reachable_time': 33142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234047, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.351 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.362 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[01a05065-aa1a-49dd-b1df-e39bff0a816b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe95:263c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559691, 'tstamp': 559691}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234048, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.377 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd45dea6-0825-4c5a-bbb3-9fcb56b4f695]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57473ab8-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:95:26:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559691, 'reachable_time': 33142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234049, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.408 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9972a9-2537-460a-af7c-27a62766b2fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.470 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ad03acab-4604-49d3-be6c-f9288674240f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.471 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57473ab8-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.472 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.472 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57473ab8-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:59 compute-0 kernel: tap57473ab8-80: entered promiscuous mode
Jan 22 22:45:59 compute-0 NetworkManager[54954]: <info>  [1769121959.4748] manager: (tap57473ab8-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.475 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.477 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57473ab8-80, col_values=(('external_ids', {'iface-id': '70e143e4-e907-4d6c-9423-8967adb571db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.478 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_controller[94850]: 2026-01-22T22:45:59Z|00628|binding|INFO|Releasing lport 70e143e4-e907-4d6c-9423-8967adb571db from this chassis (sb_readonly=0)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.489 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.490 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.490 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.492 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2c8665-fe06-4164-916b-06a27f74f9b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.493 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/57473ab8-82ff-44c6-9161-154974021c91.pid.haproxy
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.494 182729 DEBUG nova.compute.manager [req-3bea6bf6-633a-421f-99b6-019e0df24cab req-8b1971dc-49ac-46c0-bd65-b64181c7107c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 57473ab8-82ff-44c6-9161-154974021c91
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:45:59 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:45:59.493 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'env', 'PROCESS_TAG=haproxy-57473ab8-82ff-44c6-9161-154974021c91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57473ab8-82ff-44c6-9161-154974021c91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.494 182729 DEBUG oslo_concurrency.lockutils [req-3bea6bf6-633a-421f-99b6-019e0df24cab req-8b1971dc-49ac-46c0-bd65-b64181c7107c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.494 182729 DEBUG oslo_concurrency.lockutils [req-3bea6bf6-633a-421f-99b6-019e0df24cab req-8b1971dc-49ac-46c0-bd65-b64181c7107c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.495 182729 DEBUG oslo_concurrency.lockutils [req-3bea6bf6-633a-421f-99b6-019e0df24cab req-8b1971dc-49ac-46c0-bd65-b64181c7107c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.495 182729 DEBUG nova.compute.manager [req-3bea6bf6-633a-421f-99b6-019e0df24cab req-8b1971dc-49ac-46c0-bd65-b64181c7107c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Processing event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.497 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.498 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121959.4979072, a5d23dff-3c57-4220-b086-cde557bfedc2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.498 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] VM Started (Lifecycle Event)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.504 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.509 182729 INFO nova.virt.libvirt.driver [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Instance spawned successfully.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.510 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.528 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.530 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.530 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.531 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.531 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.532 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.532 182729 DEBUG nova.virt.libvirt.driver [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.537 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.559 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.560 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121959.499977, a5d23dff-3c57-4220-b086-cde557bfedc2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.560 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] VM Paused (Lifecycle Event)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.582 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.587 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769121959.504955, a5d23dff-3c57-4220-b086-cde557bfedc2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.588 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] VM Resumed (Lifecycle Event)
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.605 182729 INFO nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Took 4.70 seconds to spawn the instance on the hypervisor.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.606 182729 DEBUG nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.610 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.617 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.639 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.702 182729 INFO nova.compute.manager [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Took 5.51 seconds to build instance.
Jan 22 22:45:59 compute-0 nova_compute[182725]: 2026-01-22 22:45:59.730 182729 DEBUG oslo_concurrency.lockutils [None req-8633aaa6-7d23-43b8-be5e-517a165dfbd7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:45:59 compute-0 podman[234088]: 2026-01-22 22:45:59.93942233 +0000 UTC m=+0.079032227 container create f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:45:59 compute-0 systemd[1]: Started libpod-conmon-f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571.scope.
Jan 22 22:45:59 compute-0 podman[234088]: 2026-01-22 22:45:59.898114662 +0000 UTC m=+0.037724569 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:46:00 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:46:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7e33415a2a5efa6c7e781fa7ee5c6905c9265ab0a8109c77601c6dd58a41f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:46:00 compute-0 podman[234088]: 2026-01-22 22:46:00.042581957 +0000 UTC m=+0.182191864 container init f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 22:46:00 compute-0 podman[234088]: 2026-01-22 22:46:00.04954312 +0000 UTC m=+0.189153017 container start f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:46:00 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [NOTICE]   (234107) : New worker (234109) forked
Jan 22 22:46:00 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [NOTICE]   (234107) : Loading success.
Jan 22 22:46:00 compute-0 nova_compute[182725]: 2026-01-22 22:46:00.411 182729 DEBUG nova.network.neutron [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Updated VIF entry in instance network info cache for port 02f58f6e-918c-4adc-af25-0761fe039b6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:46:00 compute-0 nova_compute[182725]: 2026-01-22 22:46:00.412 182729 DEBUG nova.network.neutron [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Updating instance_info_cache with network_info: [{"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:46:00 compute-0 nova_compute[182725]: 2026-01-22 22:46:00.429 182729 DEBUG oslo_concurrency.lockutils [req-ac95a8ed-2585-47da-ba92-1d9b5a94d704 req-6f6f73b1-6be1-4fe7-b766-80cf42634b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a5d23dff-3c57-4220-b086-cde557bfedc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.285 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.286 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.287 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.288 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.289 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.305 182729 INFO nova.compute.manager [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Terminating instance
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.320 182729 DEBUG nova.compute.manager [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:46:01 compute-0 kernel: tap02f58f6e-91 (unregistering): left promiscuous mode
Jan 22 22:46:01 compute-0 NetworkManager[54954]: <info>  [1769121961.3484] device (tap02f58f6e-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.351 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 ovn_controller[94850]: 2026-01-22T22:46:01Z|00629|binding|INFO|Releasing lport 02f58f6e-918c-4adc-af25-0761fe039b6d from this chassis (sb_readonly=0)
Jan 22 22:46:01 compute-0 ovn_controller[94850]: 2026-01-22T22:46:01Z|00630|binding|INFO|Setting lport 02f58f6e-918c-4adc-af25-0761fe039b6d down in Southbound
Jan 22 22:46:01 compute-0 ovn_controller[94850]: 2026-01-22T22:46:01Z|00631|binding|INFO|Removing iface tap02f58f6e-91 ovn-installed in OVS
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.354 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.364 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:b6:a6 10.100.0.12'], port_security=['fa:16:3e:2c:b6:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a5d23dff-3c57-4220-b086-cde557bfedc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57473ab8-82ff-44c6-9161-154974021c91', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-191709835', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '9', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d879b131-bf41-479d-8ea2-01de2458b7e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=02f58f6e-918c-4adc-af25-0761fe039b6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.367 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 02f58f6e-918c-4adc-af25-0761fe039b6d in datapath 57473ab8-82ff-44c6-9161-154974021c91 unbound from our chassis
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.370 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57473ab8-82ff-44c6-9161-154974021c91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.372 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[512bf64c-4a03-4390-9d96-c72b0a3874ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.373 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 namespace which is not needed anymore
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 22 22:46:01 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000009b.scope: Consumed 2.125s CPU time.
Jan 22 22:46:01 compute-0 systemd-machined[154006]: Machine qemu-69-instance-0000009b terminated.
Jan 22 22:46:01 compute-0 ovn_controller[94850]: 2026-01-22T22:46:01Z|00632|binding|INFO|Releasing lport 70e143e4-e907-4d6c-9423-8967adb571db from this chassis (sb_readonly=0)
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.511 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [NOTICE]   (234107) : haproxy version is 2.8.14-c23fe91
Jan 22 22:46:01 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [NOTICE]   (234107) : path to executable is /usr/sbin/haproxy
Jan 22 22:46:01 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [WARNING]  (234107) : Exiting Master process...
Jan 22 22:46:01 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [ALERT]    (234107) : Current worker (234109) exited with code 143 (Terminated)
Jan 22 22:46:01 compute-0 neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91[234103]: [WARNING]  (234107) : All workers exited. Exiting... (0)
Jan 22 22:46:01 compute-0 systemd[1]: libpod-f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571.scope: Deactivated successfully.
Jan 22 22:46:01 compute-0 podman[234140]: 2026-01-22 22:46:01.53899406 +0000 UTC m=+0.069466969 container died f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:46:01 compute-0 NetworkManager[54954]: <info>  [1769121961.5956] manager: (tap02f58f6e-91): new Tun device (/org/freedesktop/NetworkManager/Devices/288)
Jan 22 22:46:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571-userdata-shm.mount: Deactivated successfully.
Jan 22 22:46:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab7e33415a2a5efa6c7e781fa7ee5c6905c9265ab0a8109c77601c6dd58a41f2-merged.mount: Deactivated successfully.
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.622 182729 DEBUG nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.622 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.622 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.623 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.623 182729 DEBUG nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] No waiting events found dispatching network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.623 182729 WARNING nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received unexpected event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with vm_state active and task_state deleting.
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.623 182729 DEBUG nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.623 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.624 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.624 182729 DEBUG oslo_concurrency.lockutils [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.624 182729 DEBUG nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] No waiting events found dispatching network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.624 182729 DEBUG nova.compute.manager [req-e83ee25f-57cf-4791-908b-8a35a281041a req-74c1bb1d-6437-429a-b780-6c2c63e28492 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-vif-unplugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:46:01 compute-0 podman[234140]: 2026-01-22 22:46:01.630454676 +0000 UTC m=+0.160927595 container cleanup f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:46:01 compute-0 systemd[1]: libpod-conmon-f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571.scope: Deactivated successfully.
Jan 22 22:46:01 compute-0 ovn_controller[94850]: 2026-01-22T22:46:01Z|00633|binding|INFO|Releasing lport 70e143e4-e907-4d6c-9423-8967adb571db from this chassis (sb_readonly=0)
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.684 182729 INFO nova.virt.libvirt.driver [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Instance destroyed successfully.
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.685 182729 DEBUG nova.objects.instance [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid a5d23dff-3c57-4220-b086-cde557bfedc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.687 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.700 182729 DEBUG nova.virt.libvirt.vif [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1264621857',display_name='tempest-TestNetworkBasicOps-server-1264621857',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1264621857',id=155,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL8HVaMwijqw7UVNY2qkT6BgdSKDoB2f7ifVMaeOHV0OXbyzxW+6gVnW+u4K3+cegoj/NJAAfhOYUVNwniyfHI2YC4rnV/O04VRZPoVUVVMkWou3jvw8D9n07Md6d5jXRw==',key_name='tempest-TestNetworkBasicOps-2031362568',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:45:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-ns5sizss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:59Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=a5d23dff-3c57-4220-b086-cde557bfedc2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.701 182729 DEBUG nova.network.os_vif_util [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "02f58f6e-918c-4adc-af25-0761fe039b6d", "address": "fa:16:3e:2c:b6:a6", "network": {"id": "57473ab8-82ff-44c6-9161-154974021c91", "bridge": "br-int", "label": "tempest-network-smoke--1848032854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f58f6e-91", "ovs_interfaceid": "02f58f6e-918c-4adc-af25-0761fe039b6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.701 182729 DEBUG nova.network.os_vif_util [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.702 182729 DEBUG os_vif [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.703 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.703 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f58f6e-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.706 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.708 182729 INFO os_vif [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:b6:a6,bridge_name='br-int',has_traffic_filtering=True,id=02f58f6e-918c-4adc-af25-0761fe039b6d,network=Network(57473ab8-82ff-44c6-9161-154974021c91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap02f58f6e-91')
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.708 182729 INFO nova.virt.libvirt.driver [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Deleting instance files /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2_del
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.709 182729 INFO nova.virt.libvirt.driver [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Deletion of /var/lib/nova/instances/a5d23dff-3c57-4220-b086-cde557bfedc2_del complete
Jan 22 22:46:01 compute-0 podman[234178]: 2026-01-22 22:46:01.722735112 +0000 UTC m=+0.055591534 container remove f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.729 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[02ba82b0-43c9-4b2e-85f4-048471253f16]: (4, ('Thu Jan 22 10:46:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 (f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571)\nf7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571\nThu Jan 22 10:46:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 (f7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571)\nf7e8b709336f5b76f49b3a7926564234a68a7732ed5cd804d7ba9c66ddc88571\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.731 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[84aa8422-f6be-4659-9fa8-95ce4a4e3d0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.732 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57473ab8-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:01 compute-0 kernel: tap57473ab8-80: left promiscuous mode
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.749 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.753 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b01ddfac-fa66-4e29-971e-c654c356c4b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.774 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0f90cc81-98c1-4176-980a-cfc07b806e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.775 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[888ed667-bfa6-4939-9d65-d7163a00d1ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.799 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f45ce4-b9c3-4720-a226-54edc927c07a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559679, 'reachable_time': 23377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234199, 'error': None, 'target': 'ovnmeta-57473ab8-82ff-44c6-9161-154974021c91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d57473ab8\x2d82ff\x2d44c6\x2d9161\x2d154974021c91.mount: Deactivated successfully.
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.804 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57473ab8-82ff-44c6-9161-154974021c91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:46:01 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:01.804 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[80c34a2a-f2c6-483c-8b24-f89bf6858803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.807 182729 INFO nova.compute.manager [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Took 0.49 seconds to destroy the instance on the hypervisor.
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.808 182729 DEBUG oslo.service.loopingcall [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.809 182729 DEBUG nova.compute.manager [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:46:01 compute-0 nova_compute[182725]: 2026-01-22 22:46:01.809 182729 DEBUG nova.network.neutron [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.391 182729 DEBUG nova.network.neutron [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.413 182729 INFO nova.compute.manager [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Took 1.60 seconds to deallocate network for instance.
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.482 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.483 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.545 182729 DEBUG nova.compute.provider_tree [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.562 182729 DEBUG nova.scheduler.client.report [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.584 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.604 182729 INFO nova.scheduler.client.report [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance a5d23dff-3c57-4220-b086-cde557bfedc2
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.683 182729 DEBUG oslo_concurrency.lockutils [None req-9d9e3d88-cfce-4fa8-baa1-df924879e5f5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.878 182729 DEBUG nova.compute.manager [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.879 182729 DEBUG oslo_concurrency.lockutils [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.879 182729 DEBUG oslo_concurrency.lockutils [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.880 182729 DEBUG oslo_concurrency.lockutils [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a5d23dff-3c57-4220-b086-cde557bfedc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.880 182729 DEBUG nova.compute.manager [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] No waiting events found dispatching network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:46:03 compute-0 nova_compute[182725]: 2026-01-22 22:46:03.880 182729 WARNING nova.compute.manager [req-89456a2b-a7f6-45a7-b439-bf5e7e9f7316 req-5bfda73b-c11e-4a37-9f76-b47ff5528d68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Received unexpected event network-vif-plugged-02f58f6e-918c-4adc-af25-0761fe039b6d for instance with vm_state deleted and task_state None.
Jan 22 22:46:04 compute-0 nova_compute[182725]: 2026-01-22 22:46:04.353 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:06 compute-0 podman[234201]: 2026-01-22 22:46:06.218872466 +0000 UTC m=+0.140902847 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 22:46:06 compute-0 nova_compute[182725]: 2026-01-22 22:46:06.705 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:46:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:46:09 compute-0 nova_compute[182725]: 2026-01-22 22:46:09.354 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:11 compute-0 nova_compute[182725]: 2026-01-22 22:46:11.737 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:12 compute-0 podman[234221]: 2026-01-22 22:46:12.184558484 +0000 UTC m=+0.100541033 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:46:12 compute-0 podman[234222]: 2026-01-22 22:46:12.194002079 +0000 UTC m=+0.103891186 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal)
Jan 22 22:46:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:12.453 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:12.454 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:12.455 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:14 compute-0 nova_compute[182725]: 2026-01-22 22:46:14.356 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:16 compute-0 nova_compute[182725]: 2026-01-22 22:46:16.683 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121961.681621, a5d23dff-3c57-4220-b086-cde557bfedc2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:46:16 compute-0 nova_compute[182725]: 2026-01-22 22:46:16.683 182729 INFO nova.compute.manager [-] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] VM Stopped (Lifecycle Event)
Jan 22 22:46:16 compute-0 nova_compute[182725]: 2026-01-22 22:46:16.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:17 compute-0 nova_compute[182725]: 2026-01-22 22:46:17.724 182729 DEBUG nova.compute.manager [None req-ec33604c-53d1-4028-8853-03e709e592a0 - - - - - -] [instance: a5d23dff-3c57-4220-b086-cde557bfedc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:46:19 compute-0 nova_compute[182725]: 2026-01-22 22:46:19.357 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:21 compute-0 nova_compute[182725]: 2026-01-22 22:46:21.740 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:22 compute-0 podman[234270]: 2026-01-22 22:46:22.134745275 +0000 UTC m=+0.061265736 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:46:22 compute-0 podman[234269]: 2026-01-22 22:46:22.166660129 +0000 UTC m=+0.085961970 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:46:22 compute-0 podman[234271]: 2026-01-22 22:46:22.173054788 +0000 UTC m=+0.084536035 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:46:22 compute-0 nova_compute[182725]: 2026-01-22 22:46:22.900 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:24 compute-0 nova_compute[182725]: 2026-01-22 22:46:24.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:26 compute-0 nova_compute[182725]: 2026-01-22 22:46:26.791 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:29 compute-0 nova_compute[182725]: 2026-01-22 22:46:29.360 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:31 compute-0 nova_compute[182725]: 2026-01-22 22:46:31.793 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:32 compute-0 nova_compute[182725]: 2026-01-22 22:46:32.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:32 compute-0 nova_compute[182725]: 2026-01-22 22:46:32.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:46:32 compute-0 nova_compute[182725]: 2026-01-22 22:46:32.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:46:32 compute-0 nova_compute[182725]: 2026-01-22 22:46:32.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:46:33 compute-0 nova_compute[182725]: 2026-01-22 22:46:33.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:33 compute-0 nova_compute[182725]: 2026-01-22 22:46:33.942 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:33 compute-0 nova_compute[182725]: 2026-01-22 22:46:33.942 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:33 compute-0 nova_compute[182725]: 2026-01-22 22:46:33.942 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:33 compute-0 nova_compute[182725]: 2026-01-22 22:46:33.942 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.149 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.150 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.31696701049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.150 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.151 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.264 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.264 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.293 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.341 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.341 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.380 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:46:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:34.385 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:57:9d 2001:db8:0:1:f816:3eff:fe60:579d 2001:db8::f816:3eff:fe60:579d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe60:579d/64 2001:db8::f816:3eff:fe60:579d/64', 'neutron:device_id': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f20a608b-4dde-4090-8331-5a96db0eeb25) old=Port_Binding(mac=['fa:16:3e:60:57:9d 2001:db8::f816:3eff:fe60:579d'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe60:579d/64', 'neutron:device_id': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:46:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:34.386 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f20a608b-4dde-4090-8331-5a96db0eeb25 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 updated
Jan 22 22:46:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:34.387 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09b515c7-d044-43d4-b895-408eb5de1fd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:46:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:34.388 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd2070c-e916-4c55-b0bf-a0ef848bad8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.409 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.435 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.451 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.468 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:46:34 compute-0 nova_compute[182725]: 2026-01-22 22:46:34.468 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:35 compute-0 nova_compute[182725]: 2026-01-22 22:46:35.468 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:35 compute-0 nova_compute[182725]: 2026-01-22 22:46:35.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:35 compute-0 nova_compute[182725]: 2026-01-22 22:46:35.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:36 compute-0 nova_compute[182725]: 2026-01-22 22:46:36.846 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:37 compute-0 podman[234336]: 2026-01-22 22:46:37.169229884 +0000 UTC m=+0.089632061 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:46:38 compute-0 nova_compute[182725]: 2026-01-22 22:46:38.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:39 compute-0 nova_compute[182725]: 2026-01-22 22:46:39.363 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:40 compute-0 nova_compute[182725]: 2026-01-22 22:46:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:40 compute-0 nova_compute[182725]: 2026-01-22 22:46:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:46:40 compute-0 nova_compute[182725]: 2026-01-22 22:46:40.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:46:41 compute-0 nova_compute[182725]: 2026-01-22 22:46:41.848 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:42 compute-0 ovn_controller[94850]: 2026-01-22T22:46:42Z|00634|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 22:46:43 compute-0 podman[234356]: 2026-01-22 22:46:43.713308502 +0000 UTC m=+0.080559385 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 22:46:43 compute-0 podman[234358]: 2026-01-22 22:46:43.74455294 +0000 UTC m=+0.093542389 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.000 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.000 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.016 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.153 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.153 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.164 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.164 182729 INFO nova.compute.claims [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.320 182729 DEBUG nova.compute.provider_tree [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.356 182729 DEBUG nova.scheduler.client.report [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.364 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.379 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.380 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.465 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.466 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.491 182729 INFO nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.511 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.644 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.646 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.647 182729 INFO nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Creating image(s)
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.648 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.648 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.649 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.675 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.772 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.773 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.774 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.789 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.844 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.845 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.879 182729 DEBUG nova.policy [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.895 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.896 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.897 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.983 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.985 182729 DEBUG nova.virt.disk.api [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:46:44 compute-0 nova_compute[182725]: 2026-01-22 22:46:44.986 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.071 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.073 182729 DEBUG nova.virt.disk.api [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.073 182729 DEBUG nova.objects.instance [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid e08a106b-5819-44ac-bcad-850c349c17cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.089 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.090 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Ensure instance console log exists: /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.091 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.091 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:45 compute-0 nova_compute[182725]: 2026-01-22 22:46:45.091 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:46 compute-0 nova_compute[182725]: 2026-01-22 22:46:46.775 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Successfully created port: 9ca80ea8-671c-4688-a4d3-26fc656e645e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:46:46 compute-0 nova_compute[182725]: 2026-01-22 22:46:46.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:47 compute-0 nova_compute[182725]: 2026-01-22 22:46:47.693 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Successfully created port: 355e0aa9-5b7b-417a-a2e2-dea353a114d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:46:48 compute-0 nova_compute[182725]: 2026-01-22 22:46:48.618 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Successfully updated port: 9ca80ea8-671c-4688-a4d3-26fc656e645e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.366 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.493 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Successfully updated port: 355e0aa9-5b7b-417a-a2e2-dea353a114d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.517 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.517 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.517 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.763 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.857 182729 DEBUG nova.compute.manager [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.858 182729 DEBUG nova.compute.manager [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing instance network info cache due to event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:46:49 compute-0 nova_compute[182725]: 2026-01-22 22:46:49.859 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:46:51 compute-0 nova_compute[182725]: 2026-01-22 22:46:51.852 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:53 compute-0 podman[234420]: 2026-01-22 22:46:53.133638868 +0000 UTC m=+0.063287546 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 22:46:53 compute-0 podman[234419]: 2026-01-22 22:46:53.146929979 +0000 UTC m=+0.070881555 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:46:53 compute-0 podman[234421]: 2026-01-22 22:46:53.171176132 +0000 UTC m=+0.089919048 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.141 182729 DEBUG nova.network.neutron [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.398 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.408 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.408 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Instance network_info: |[{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.410 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.410 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.417 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Start _get_guest_xml network_info=[{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.424 182729 WARNING nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.433 182729 DEBUG nova.virt.libvirt.host [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.434 182729 DEBUG nova.virt.libvirt.host [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.444 182729 DEBUG nova.virt.libvirt.host [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.445 182729 DEBUG nova.virt.libvirt.host [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.447 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.448 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.449 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.449 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.450 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.451 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.451 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.452 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.453 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.453 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.454 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.454 182729 DEBUG nova.virt.hardware [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.462 182729 DEBUG nova.virt.libvirt.vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:44Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.463 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.465 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.467 182729 DEBUG nova.virt.libvirt.vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:44Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.467 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.469 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.470 182729 DEBUG nova.objects.instance [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid e08a106b-5819-44ac-bcad-850c349c17cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.496 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <uuid>e08a106b-5819-44ac-bcad-850c349c17cf</uuid>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <name>instance-0000009e</name>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:name>tempest-TestGettingAddress-server-668956998</nova:name>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:46:54</nova:creationTime>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:port uuid="9ca80ea8-671c-4688-a4d3-26fc656e645e">
Jan 22 22:46:54 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         <nova:port uuid="355e0aa9-5b7b-417a-a2e2-dea353a114d0">
Jan 22 22:46:54 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe56:687d" ipVersion="6"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe56:687d" ipVersion="6"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <system>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="serial">e08a106b-5819-44ac-bcad-850c349c17cf</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="uuid">e08a106b-5819-44ac-bcad-850c349c17cf</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </system>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <os>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </os>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <features>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </features>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.config"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:e6:b4:50"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <target dev="tap9ca80ea8-67"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:56:68:7d"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <target dev="tap355e0aa9-5b"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/console.log" append="off"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <video>
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </video>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:46:54 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:46:54 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:46:54 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:46:54 compute-0 nova_compute[182725]: </domain>
Jan 22 22:46:54 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.498 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Preparing to wait for external event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.498 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.498 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.499 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.499 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Preparing to wait for external event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.499 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.499 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.499 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.500 182729 DEBUG nova.virt.libvirt.vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:44Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.500 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.500 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.501 182729 DEBUG os_vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.501 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.501 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.502 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.504 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.505 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ca80ea8-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.505 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ca80ea8-67, col_values=(('external_ids', {'iface-id': '9ca80ea8-671c-4688-a4d3-26fc656e645e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:b4:50', 'vm-uuid': 'e08a106b-5819-44ac-bcad-850c349c17cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.507 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.508 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:46:54 compute-0 NetworkManager[54954]: <info>  [1769122014.5091] manager: (tap9ca80ea8-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.516 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.518 182729 INFO os_vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67')
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.519 182729 DEBUG nova.virt.libvirt.vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:44Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.520 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.521 182729 DEBUG nova.network.os_vif_util [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.522 182729 DEBUG os_vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.523 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.523 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.524 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.526 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.527 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap355e0aa9-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.527 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap355e0aa9-5b, col_values=(('external_ids', {'iface-id': '355e0aa9-5b7b-417a-a2e2-dea353a114d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:68:7d', 'vm-uuid': 'e08a106b-5819-44ac-bcad-850c349c17cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.529 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 NetworkManager[54954]: <info>  [1769122014.5309] manager: (tap355e0aa9-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.533 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.539 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.541 182729 INFO os_vif [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b')
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.611 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.612 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.612 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:e6:b4:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.613 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:56:68:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:46:54 compute-0 nova_compute[182725]: 2026-01-22 22:46:54.614 182729 INFO nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Using config drive
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.188 182729 INFO nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Creating config drive at /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.config
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.198 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjsfnygfa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.340 182729 DEBUG oslo_concurrency.processutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjsfnygfa" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.4330] manager: (tap9ca80ea8-67): new Tun device (/org/freedesktop/NetworkManager/Devices/291)
Jan 22 22:46:55 compute-0 kernel: tap9ca80ea8-67: entered promiscuous mode
Jan 22 22:46:55 compute-0 systemd-udevd[234507]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.479 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00635|binding|INFO|Claiming lport 9ca80ea8-671c-4688-a4d3-26fc656e645e for this chassis.
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00636|binding|INFO|9ca80ea8-671c-4688-a4d3-26fc656e645e: Claiming fa:16:3e:e6:b4:50 10.100.0.14
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.4897] manager: (tap355e0aa9-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 22 22:46:55 compute-0 kernel: tap355e0aa9-5b: entered promiscuous mode
Jan 22 22:46:55 compute-0 systemd-udevd[234512]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00637|if_status|INFO|Not updating pb chassis for 355e0aa9-5b7b-417a-a2e2-dea353a114d0 now as sb is readonly
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.4988] device (tap9ca80ea8-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.5010] device (tap9ca80ea8-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.5071] device (tap355e0aa9-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.5090] device (tap355e0aa9-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00638|binding|INFO|Claiming lport 355e0aa9-5b7b-417a-a2e2-dea353a114d0 for this chassis.
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00639|binding|INFO|355e0aa9-5b7b-417a-a2e2-dea353a114d0: Claiming fa:16:3e:56:68:7d 2001:db8:0:1:f816:3eff:fe56:687d 2001:db8::f816:3eff:fe56:687d
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.517 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:b4:50 10.100.0.14'], port_security=['fa:16:3e:e6:b4:50 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e08a106b-5819-44ac-bcad-850c349c17cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55b79226-17c7-4623-9f19-8585aca1b119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=297b2520-2860-4872-a497-3a3478b0820d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9ca80ea8-671c-4688-a4d3-26fc656e645e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.519 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9ca80ea8-671c-4688-a4d3-26fc656e645e in datapath 55b79226-17c7-4623-9f19-8585aca1b119 bound to our chassis
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.521 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55b79226-17c7-4623-9f19-8585aca1b119
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.534 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[71570205-572d-4037-9cbb-67396ec091bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.537 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:68:7d 2001:db8:0:1:f816:3eff:fe56:687d 2001:db8::f816:3eff:fe56:687d'], port_security=['fa:16:3e:56:68:7d 2001:db8:0:1:f816:3eff:fe56:687d 2001:db8::f816:3eff:fe56:687d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe56:687d/64 2001:db8::f816:3eff:fe56:687d/64', 'neutron:device_id': 'e08a106b-5819-44ac-bcad-850c349c17cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=355e0aa9-5b7b-417a-a2e2-dea353a114d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.539 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55b79226-11 in ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.542 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55b79226-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.542 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2feaf4c2-7721-4bcc-85a3-a06492153f7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.543 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60639175-44b5-4007-992d-73d55666d4ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.558 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a995a3-8551-41c2-b62f-e2335a8f763b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 systemd-machined[154006]: New machine qemu-70-instance-0000009e.
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.587 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.589 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[48dad025-4e5d-4b60-855e-8d7c53c51fe8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00640|binding|INFO|Setting lport 9ca80ea8-671c-4688-a4d3-26fc656e645e ovn-installed in OVS
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00641|binding|INFO|Setting lport 9ca80ea8-671c-4688-a4d3-26fc656e645e up in Southbound
Jan 22 22:46:55 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-0000009e.
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.596 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00642|binding|INFO|Setting lport 355e0aa9-5b7b-417a-a2e2-dea353a114d0 ovn-installed in OVS
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.609 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00643|binding|INFO|Setting lport 355e0aa9-5b7b-417a-a2e2-dea353a114d0 up in Southbound
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.628 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[adc66808-3121-40b7-ac0f-006251878a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.6381] manager: (tap55b79226-10): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.635 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9b12b9-1f75-410b-8d6d-e5f37f1514c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.676 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[3da4b61c-ea64-453b-8eeb-86838c482d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.681 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0c735f58-0a7d-4583-8fce-9dae90393747]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.7104] device (tap55b79226-10): carrier: link connected
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.722 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[c176aa2e-2ba6-4015-9da6-a8b30bac9c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.746 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fc014d5d-2bc6-442b-a108-e997269bf75b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55b79226-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:9b:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565331, 'reachable_time': 22100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234548, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.771 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ea39a81b-712c-432b-8547-484201e15689]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:9b3d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565331, 'tstamp': 565331}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234549, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.797 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c3327e81-3f71-4f7e-b352-ad4e0046bbc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55b79226-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:9b:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565331, 'reachable_time': 22100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234550, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.842 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a031adb6-afd3-4476-88ad-c77f26fd5cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.914 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[542078ea-9391-460a-bb18-87b57c100f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.916 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55b79226-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.916 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.916 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55b79226-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.918 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 kernel: tap55b79226-10: entered promiscuous mode
Jan 22 22:46:55 compute-0 NetworkManager[54954]: <info>  [1769122015.9206] manager: (tap55b79226-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.922 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55b79226-10, col_values=(('external_ids', {'iface-id': '781e89a1-cc2a-4079-9a3d-fcfacb1013c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:55 compute-0 ovn_controller[94850]: 2026-01-22T22:46:55Z|00644|binding|INFO|Releasing lport 781e89a1-cc2a-4079-9a3d-fcfacb1013c1 from this chassis (sb_readonly=0)
Jan 22 22:46:55 compute-0 nova_compute[182725]: 2026-01-22 22:46:55.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.949 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.950 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94323233-e501-4e7d-80a2-900d270e85a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.952 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-55b79226-17c7-4623-9f19-8585aca1b119
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 55b79226-17c7-4623-9f19-8585aca1b119
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:46:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:55.954 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'env', 'PROCESS_TAG=haproxy-55b79226-17c7-4623-9f19-8585aca1b119', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55b79226-17c7-4623-9f19-8585aca1b119.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.005 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122016.003554, e08a106b-5819-44ac-bcad-850c349c17cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.005 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] VM Started (Lifecycle Event)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.068 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.074 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122016.0037398, e08a106b-5819-44ac-bcad-850c349c17cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.074 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] VM Paused (Lifecycle Event)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.095 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.100 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.128 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.267 182729 DEBUG nova.compute.manager [req-0c1a8e91-0c7f-47aa-9801-e59e97770c3d req-d8b765b2-7601-445b-b08b-feace1931507 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.268 182729 DEBUG oslo_concurrency.lockutils [req-0c1a8e91-0c7f-47aa-9801-e59e97770c3d req-d8b765b2-7601-445b-b08b-feace1931507 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.269 182729 DEBUG oslo_concurrency.lockutils [req-0c1a8e91-0c7f-47aa-9801-e59e97770c3d req-d8b765b2-7601-445b-b08b-feace1931507 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.270 182729 DEBUG oslo_concurrency.lockutils [req-0c1a8e91-0c7f-47aa-9801-e59e97770c3d req-d8b765b2-7601-445b-b08b-feace1931507 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.270 182729 DEBUG nova.compute.manager [req-0c1a8e91-0c7f-47aa-9801-e59e97770c3d req-d8b765b2-7601-445b-b08b-feace1931507 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Processing event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:46:56 compute-0 podman[234589]: 2026-01-22 22:46:56.414748788 +0000 UTC m=+0.079610362 container create 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:46:56 compute-0 systemd[1]: Started libpod-conmon-1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9.scope.
Jan 22 22:46:56 compute-0 podman[234589]: 2026-01-22 22:46:56.375305307 +0000 UTC m=+0.040166941 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.489 182729 DEBUG nova.compute.manager [req-7f31065a-1883-4e46-83fe-cf42c5676222 req-dd9aec69-5b55-40d9-b995-e29c7f2b052b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.489 182729 DEBUG oslo_concurrency.lockutils [req-7f31065a-1883-4e46-83fe-cf42c5676222 req-dd9aec69-5b55-40d9-b995-e29c7f2b052b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.490 182729 DEBUG oslo_concurrency.lockutils [req-7f31065a-1883-4e46-83fe-cf42c5676222 req-dd9aec69-5b55-40d9-b995-e29c7f2b052b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.490 182729 DEBUG oslo_concurrency.lockutils [req-7f31065a-1883-4e46-83fe-cf42c5676222 req-dd9aec69-5b55-40d9-b995-e29c7f2b052b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.490 182729 DEBUG nova.compute.manager [req-7f31065a-1883-4e46-83fe-cf42c5676222 req-dd9aec69-5b55-40d9-b995-e29c7f2b052b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Processing event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.491 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.496 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122016.4958553, e08a106b-5819-44ac-bcad-850c349c17cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.496 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] VM Resumed (Lifecycle Event)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.497 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:46:56 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.500 182729 INFO nova.virt.libvirt.driver [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Instance spawned successfully.
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.501 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:46:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149a517feb78e5daa9e498256278bd160695c0033672d675172f9e27e8fa25be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:46:56 compute-0 podman[234589]: 2026-01-22 22:46:56.521501494 +0000 UTC m=+0.186363078 container init 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:46:56 compute-0 podman[234589]: 2026-01-22 22:46:56.529204286 +0000 UTC m=+0.194065840 container start 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.530 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.536 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.539 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.539 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.539 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.540 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.540 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.540 182729 DEBUG nova.virt.libvirt.driver [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:46:56 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [NOTICE]   (234608) : New worker (234610) forked
Jan 22 22:46:56 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [NOTICE]   (234608) : Loading success.
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.581 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 355e0aa9-5b7b-417a-a2e2-dea353a114d0 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 unbound from our chassis
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.583 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09b515c7-d044-43d4-b895-408eb5de1fd8
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.595 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6aba631f-1609-411c-b31f-8e4915537ac2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.596 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09b515c7-d1 in ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.600 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09b515c7-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.600 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d5efb392-28a3-4049-a3c4-4f13db90b9e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.601 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[458254a9-8710-451f-a7ca-fbc9b24b5397]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.616 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[464db6b6-bbd6-412d-a8f4-f85649d43427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.629 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.632 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e660c9-17a2-4cca-af03-fb9d7ca6310f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.675 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[52ac492e-c822-4f42-9e67-969ad1fce59a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 NetworkManager[54954]: <info>  [1769122016.6823] manager: (tap09b515c7-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/295)
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.680 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[997f767a-06b2-454b-a2f4-4e21510a43d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.687 182729 INFO nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Took 12.04 seconds to spawn the instance on the hypervisor.
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.687 182729 DEBUG nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.730 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d967577f-8669-4886-ad8f-c94f343e66bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.734 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4351853a-c28b-4dd5-9e34-0de192050181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 NetworkManager[54954]: <info>  [1769122016.7654] device (tap09b515c7-d0): carrier: link connected
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.774 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e90f05-8e10-4adf-be01-7b767e829891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.800 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dba373c0-9d2f-465d-a2ca-5ab3528fa3bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09b515c7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:57:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565436, 'reachable_time': 32335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234629, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.829 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[23413d0a-21ab-4b6d-b5c5-71faa0611834]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:579d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565436, 'tstamp': 565436}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234630, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.856 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca521b5-10c3-40c6-bcc8-8764612c5383]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09b515c7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:57:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 195], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565436, 'reachable_time': 32335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234631, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.900 182729 INFO nova.compute.manager [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Took 12.81 seconds to build instance.
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.909 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4b205701-e208-48ba-b26e-3191069cce66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.923 182729 DEBUG oslo_concurrency.lockutils [None req-cd907495-3d47-4c98-95c2-be6230f76d3b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.949 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b048d3-7a93-4e6c-b616-7cfad34bffb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.952 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09b515c7-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.953 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.953 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09b515c7-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.956 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:56 compute-0 kernel: tap09b515c7-d0: entered promiscuous mode
Jan 22 22:46:56 compute-0 NetworkManager[54954]: <info>  [1769122016.9577] manager: (tap09b515c7-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.958 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.963 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.965 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09b515c7-d0, col_values=(('external_ids', {'iface-id': 'f20a608b-4dde-4090-8331-5a96db0eeb25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.966 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:56 compute-0 ovn_controller[94850]: 2026-01-22T22:46:56Z|00645|binding|INFO|Releasing lport f20a608b-4dde-4090-8331-5a96db0eeb25 from this chassis (sb_readonly=0)
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.968 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.970 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.971 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2a5e26-0eb9-4961-9926-93eec9ded124]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.972 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-09b515c7-d044-43d4-b895-408eb5de1fd8
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 09b515c7-d044-43d4-b895-408eb5de1fd8
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:46:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:56.973 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'env', 'PROCESS_TAG=haproxy-09b515c7-d044-43d4-b895-408eb5de1fd8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09b515c7-d044-43d4-b895-408eb5de1fd8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:46:56 compute-0 nova_compute[182725]: 2026-01-22 22:46:56.992 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:57 compute-0 podman[234662]: 2026-01-22 22:46:57.407722365 +0000 UTC m=+0.050574759 container create e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:46:57 compute-0 systemd[1]: Started libpod-conmon-e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419.scope.
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.461 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updated VIF entry in instance network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.461 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:46:57 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:46:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1721badf8709fb832e30cddd195fe16a8b68c61607d8b228befae179d14d96e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:46:57 compute-0 podman[234662]: 2026-01-22 22:46:57.382946769 +0000 UTC m=+0.025799193 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:46:57 compute-0 podman[234662]: 2026-01-22 22:46:57.49151112 +0000 UTC m=+0.134363614 container init e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 22:46:57 compute-0 podman[234662]: 2026-01-22 22:46:57.500196187 +0000 UTC m=+0.143048631 container start e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.501 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.502 182729 DEBUG nova.compute.manager [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-changed-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.503 182729 DEBUG nova.compute.manager [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing instance network info cache due to event network-changed-355e0aa9-5b7b-417a-a2e2-dea353a114d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.503 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.504 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:46:57 compute-0 nova_compute[182725]: 2026-01-22 22:46:57.505 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing network info cache for port 355e0aa9-5b7b-417a-a2e2-dea353a114d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:46:57 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [NOTICE]   (234682) : New worker (234684) forked
Jan 22 22:46:57 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [NOTICE]   (234682) : Loading success.
Jan 22 22:46:57 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:57.569 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.446 182729 DEBUG nova.compute.manager [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.447 182729 DEBUG oslo_concurrency.lockutils [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.448 182729 DEBUG oslo_concurrency.lockutils [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.448 182729 DEBUG oslo_concurrency.lockutils [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.449 182729 DEBUG nova.compute.manager [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.449 182729 WARNING nova.compute.manager [req-e3f795fc-4ba1-4edc-a7e9-db11f2a74721 req-62886723-a37e-4e0f-81da-0754aa398c4e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received unexpected event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 for instance with vm_state active and task_state None.
Jan 22 22:46:58 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:46:58.572 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.708 182729 DEBUG nova.compute.manager [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.710 182729 DEBUG oslo_concurrency.lockutils [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.711 182729 DEBUG oslo_concurrency.lockutils [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.711 182729 DEBUG oslo_concurrency.lockutils [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.712 182729 DEBUG nova.compute.manager [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:46:58 compute-0 nova_compute[182725]: 2026-01-22 22:46:58.713 182729 WARNING nova.compute.manager [req-d72b854f-fcd3-4a31-8adb-b9fc8b72c771 req-41a5721e-ec09-4c53-9152-88df139cfe34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received unexpected event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e for instance with vm_state active and task_state None.
Jan 22 22:46:59 compute-0 nova_compute[182725]: 2026-01-22 22:46:59.401 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:46:59 compute-0 nova_compute[182725]: 2026-01-22 22:46:59.529 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:03 compute-0 nova_compute[182725]: 2026-01-22 22:47:03.581 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updated VIF entry in instance network info cache for port 355e0aa9-5b7b-417a-a2e2-dea353a114d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:47:03 compute-0 nova_compute[182725]: 2026-01-22 22:47:03.583 182729 DEBUG nova.network.neutron [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:03 compute-0 nova_compute[182725]: 2026-01-22 22:47:03.626 182729 DEBUG oslo_concurrency.lockutils [req-b8b01313-863b-4314-ae5e-70030dd1de2d req-68094151-79f4-4c73-8fc9-bf6098b13509 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:04 compute-0 nova_compute[182725]: 2026-01-22 22:47:04.236 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:04 compute-0 NetworkManager[54954]: <info>  [1769122024.2381] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 22 22:47:04 compute-0 NetworkManager[54954]: <info>  [1769122024.2399] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 22 22:47:04 compute-0 nova_compute[182725]: 2026-01-22 22:47:04.293 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:04 compute-0 ovn_controller[94850]: 2026-01-22T22:47:04Z|00646|binding|INFO|Releasing lport f20a608b-4dde-4090-8331-5a96db0eeb25 from this chassis (sb_readonly=0)
Jan 22 22:47:04 compute-0 ovn_controller[94850]: 2026-01-22T22:47:04Z|00647|binding|INFO|Releasing lport 781e89a1-cc2a-4079-9a3d-fcfacb1013c1 from this chassis (sb_readonly=0)
Jan 22 22:47:04 compute-0 nova_compute[182725]: 2026-01-22 22:47:04.316 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:04 compute-0 nova_compute[182725]: 2026-01-22 22:47:04.403 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:04 compute-0 nova_compute[182725]: 2026-01-22 22:47:04.531 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:05 compute-0 nova_compute[182725]: 2026-01-22 22:47:05.206 182729 DEBUG nova.compute.manager [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:47:05 compute-0 nova_compute[182725]: 2026-01-22 22:47:05.207 182729 DEBUG nova.compute.manager [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing instance network info cache due to event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:47:05 compute-0 nova_compute[182725]: 2026-01-22 22:47:05.208 182729 DEBUG oslo_concurrency.lockutils [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:47:05 compute-0 nova_compute[182725]: 2026-01-22 22:47:05.209 182729 DEBUG oslo_concurrency.lockutils [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:47:05 compute-0 nova_compute[182725]: 2026-01-22 22:47:05.209 182729 DEBUG nova.network.neutron [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:47:07 compute-0 nova_compute[182725]: 2026-01-22 22:47:07.540 182729 DEBUG nova.network.neutron [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updated VIF entry in instance network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:47:07 compute-0 nova_compute[182725]: 2026-01-22 22:47:07.541 182729 DEBUG nova.network.neutron [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:07 compute-0 nova_compute[182725]: 2026-01-22 22:47:07.568 182729 DEBUG oslo_concurrency.lockutils [req-36b92dec-cd30-4172-801a-003def90625d req-2669b4cf-5d75-4b1c-bb43-40c10cc1a910 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:08 compute-0 podman[234719]: 2026-01-22 22:47:08.185895618 +0000 UTC m=+0.098297547 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:47:08 compute-0 ovn_controller[94850]: 2026-01-22T22:47:08Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:b4:50 10.100.0.14
Jan 22 22:47:08 compute-0 ovn_controller[94850]: 2026-01-22T22:47:08Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:b4:50 10.100.0.14
Jan 22 22:47:09 compute-0 nova_compute[182725]: 2026-01-22 22:47:09.406 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:09 compute-0 nova_compute[182725]: 2026-01-22 22:47:09.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:12.455 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:12.456 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:12.456 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:14 compute-0 podman[234742]: 2026-01-22 22:47:14.146713784 +0000 UTC m=+0.072060894 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Jan 22 22:47:14 compute-0 podman[234741]: 2026-01-22 22:47:14.179741366 +0000 UTC m=+0.110398838 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 22:47:14 compute-0 nova_compute[182725]: 2026-01-22 22:47:14.410 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:14 compute-0 nova_compute[182725]: 2026-01-22 22:47:14.539 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:17 compute-0 ovn_controller[94850]: 2026-01-22T22:47:17Z|00648|binding|INFO|Releasing lport f20a608b-4dde-4090-8331-5a96db0eeb25 from this chassis (sb_readonly=0)
Jan 22 22:47:17 compute-0 ovn_controller[94850]: 2026-01-22T22:47:17Z|00649|binding|INFO|Releasing lport 781e89a1-cc2a-4079-9a3d-fcfacb1013c1 from this chassis (sb_readonly=0)
Jan 22 22:47:17 compute-0 nova_compute[182725]: 2026-01-22 22:47:17.899 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:18 compute-0 nova_compute[182725]: 2026-01-22 22:47:18.520 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:19 compute-0 nova_compute[182725]: 2026-01-22 22:47:19.412 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:19 compute-0 nova_compute[182725]: 2026-01-22 22:47:19.541 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:22 compute-0 nova_compute[182725]: 2026-01-22 22:47:22.346 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:22 compute-0 nova_compute[182725]: 2026-01-22 22:47:22.639 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:24 compute-0 podman[234784]: 2026-01-22 22:47:24.116699957 +0000 UTC m=+0.051417830 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:47:24 compute-0 podman[234786]: 2026-01-22 22:47:24.131478625 +0000 UTC m=+0.054426595 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:47:24 compute-0 podman[234785]: 2026-01-22 22:47:24.144500749 +0000 UTC m=+0.077512499 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 22:47:24 compute-0 nova_compute[182725]: 2026-01-22 22:47:24.416 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:24 compute-0 nova_compute[182725]: 2026-01-22 22:47:24.543 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:24 compute-0 nova_compute[182725]: 2026-01-22 22:47:24.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:28 compute-0 nova_compute[182725]: 2026-01-22 22:47:28.238 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:29 compute-0 nova_compute[182725]: 2026-01-22 22:47:29.196 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:29 compute-0 nova_compute[182725]: 2026-01-22 22:47:29.417 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:29 compute-0 nova_compute[182725]: 2026-01-22 22:47:29.545 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:33 compute-0 nova_compute[182725]: 2026-01-22 22:47:33.951 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:34 compute-0 nova_compute[182725]: 2026-01-22 22:47:34.420 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:34 compute-0 nova_compute[182725]: 2026-01-22 22:47:34.547 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:34 compute-0 nova_compute[182725]: 2026-01-22 22:47:34.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:34 compute-0 nova_compute[182725]: 2026-01-22 22:47:34.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:47:34 compute-0 nova_compute[182725]: 2026-01-22 22:47:34.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:47:35 compute-0 nova_compute[182725]: 2026-01-22 22:47:35.159 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:47:35 compute-0 nova_compute[182725]: 2026-01-22 22:47:35.159 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:47:35 compute-0 nova_compute[182725]: 2026-01-22 22:47:35.160 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:47:35 compute-0 nova_compute[182725]: 2026-01-22 22:47:35.160 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e08a106b-5819-44ac-bcad-850c349c17cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.411 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.438 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.439 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.440 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.440 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.440 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.441 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.469 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.470 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.470 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.470 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.555 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:38 compute-0 podman[234848]: 2026-01-22 22:47:38.598107507 +0000 UTC m=+0.076789922 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.649 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.651 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.727 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.890 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.891 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.28820419311523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.891 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.892 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.975 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance e08a106b-5819-44ac-bcad-850c349c17cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.976 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:47:38 compute-0 nova_compute[182725]: 2026-01-22 22:47:38.976 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.047 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.059 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.085 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.086 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.422 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:39 compute-0 nova_compute[182725]: 2026-01-22 22:47:39.550 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.534 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.852 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.852 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.869 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.978 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.979 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.986 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:47:40 compute-0 nova_compute[182725]: 2026-01-22 22:47:40.986 182729 INFO nova.compute.claims [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.154 182729 DEBUG nova.compute.provider_tree [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.176 182729 DEBUG nova.scheduler.client.report [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.204 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.206 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.291 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.291 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.324 182729 INFO nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.349 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.480 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.482 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.483 182729 INFO nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Creating image(s)
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.484 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.485 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.486 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.512 182729 DEBUG nova.policy [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.517 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.596 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.598 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.599 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.624 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.680 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.682 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.732 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.734 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.737 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.831 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.832 182729 DEBUG nova.virt.disk.api [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Checking if we can resize image /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.833 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.897 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.898 182729 DEBUG nova.virt.disk.api [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Cannot resize image /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.899 182729 DEBUG nova.objects.instance [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'migration_context' on Instance uuid cb8ac1a6-ad25-4019-add5-64c347b769cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.913 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.913 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Ensure instance console log exists: /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.914 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.914 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:41 compute-0 nova_compute[182725]: 2026-01-22 22:47:41.914 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:42 compute-0 nova_compute[182725]: 2026-01-22 22:47:42.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:43 compute-0 nova_compute[182725]: 2026-01-22 22:47:43.410 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Successfully created port: a585bd6a-2858-42c6-a61d-e2f5ae701b22 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:47:44 compute-0 nova_compute[182725]: 2026-01-22 22:47:44.425 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:44 compute-0 nova_compute[182725]: 2026-01-22 22:47:44.552 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:45 compute-0 podman[234891]: 2026-01-22 22:47:45.151917838 +0000 UTC m=+0.070479005 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64)
Jan 22 22:47:45 compute-0 podman[234890]: 2026-01-22 22:47:45.199746898 +0000 UTC m=+0.127002741 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.750 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Successfully updated port: a585bd6a-2858-42c6-a61d-e2f5ae701b22 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.769 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.769 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquired lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.769 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.824 182729 DEBUG nova.compute.manager [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.825 182729 DEBUG nova.compute.manager [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing instance network info cache due to event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:47:45 compute-0 nova_compute[182725]: 2026-01-22 22:47:45.825 182729 DEBUG oslo_concurrency.lockutils [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.244 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.969 182729 DEBUG nova.network.neutron [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.986 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Releasing lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.987 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Instance network_info: |[{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.987 182729 DEBUG oslo_concurrency.lockutils [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.988 182729 DEBUG nova.network.neutron [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:47:46 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.993 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Start _get_guest_xml network_info=[{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:46.999 182729 WARNING nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.013 182729 DEBUG nova.virt.libvirt.host [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.014 182729 DEBUG nova.virt.libvirt.host [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.019 182729 DEBUG nova.virt.libvirt.host [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.020 182729 DEBUG nova.virt.libvirt.host [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.022 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.022 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.023 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.024 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.025 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.025 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.026 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.026 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.027 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.027 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.028 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.028 182729 DEBUG nova.virt.hardware [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.036 182729 DEBUG nova.virt.libvirt.vif [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-207363830',display_name='tempest-TestSnapshotPattern-server-207363830',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-207363830',id=161,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-5461ovew',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:41Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',uuid=cb8ac1a6-ad25-4019-add5-64c347b769cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.037 182729 DEBUG nova.network.os_vif_util [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.039 182729 DEBUG nova.network.os_vif_util [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.041 182729 DEBUG nova.objects.instance [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'pci_devices' on Instance uuid cb8ac1a6-ad25-4019-add5-64c347b769cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.055 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <uuid>cb8ac1a6-ad25-4019-add5-64c347b769cb</uuid>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <name>instance-000000a1</name>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:name>tempest-TestSnapshotPattern-server-207363830</nova:name>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:47:47</nova:creationTime>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:user uuid="abbb13a7c01949c8b45e4e3263026c12">tempest-TestSnapshotPattern-1578752051-project-member</nova:user>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:project uuid="a0876e1a4cab4f9997487dc31953aafd">tempest-TestSnapshotPattern-1578752051</nova:project>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         <nova:port uuid="a585bd6a-2858-42c6-a61d-e2f5ae701b22">
Jan 22 22:47:47 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <system>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="serial">cb8ac1a6-ad25-4019-add5-64c347b769cb</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="uuid">cb8ac1a6-ad25-4019-add5-64c347b769cb</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </system>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <os>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </os>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <features>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </features>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.config"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:67:da:ab"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <target dev="tapa585bd6a-28"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/console.log" append="off"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <video>
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </video>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:47:47 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:47:47 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:47:47 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:47:47 compute-0 nova_compute[182725]: </domain>
Jan 22 22:47:47 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.057 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Preparing to wait for external event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.057 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.058 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.058 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.059 182729 DEBUG nova.virt.libvirt.vif [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-207363830',display_name='tempest-TestSnapshotPattern-server-207363830',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-207363830',id=161,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-5461ovew',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:41Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',uuid=cb8ac1a6-ad25-4019-add5-64c347b769cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.059 182729 DEBUG nova.network.os_vif_util [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.060 182729 DEBUG nova.network.os_vif_util [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.061 182729 DEBUG os_vif [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.061 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.062 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.062 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.065 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.066 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa585bd6a-28, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.066 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa585bd6a-28, col_values=(('external_ids', {'iface-id': 'a585bd6a-2858-42c6-a61d-e2f5ae701b22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:da:ab', 'vm-uuid': 'cb8ac1a6-ad25-4019-add5-64c347b769cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.068 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:47 compute-0 NetworkManager[54954]: <info>  [1769122067.0691] manager: (tapa585bd6a-28): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.070 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.080 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.082 182729 INFO os_vif [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28')
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.146 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.147 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.147 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No VIF found with MAC fa:16:3e:67:da:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:47:47 compute-0 nova_compute[182725]: 2026-01-22 22:47:47.148 182729 INFO nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Using config drive
Jan 22 22:47:48 compute-0 nova_compute[182725]: 2026-01-22 22:47:48.922 182729 INFO nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Creating config drive at /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.config
Jan 22 22:47:48 compute-0 nova_compute[182725]: 2026-01-22 22:47:48.928 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplioy27qo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.058 182729 DEBUG oslo_concurrency.processutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplioy27qo" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:47:49 compute-0 kernel: tapa585bd6a-28: entered promiscuous mode
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.122 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.1226] manager: (tapa585bd6a-28): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Jan 22 22:47:49 compute-0 ovn_controller[94850]: 2026-01-22T22:47:49Z|00650|binding|INFO|Claiming lport a585bd6a-2858-42c6-a61d-e2f5ae701b22 for this chassis.
Jan 22 22:47:49 compute-0 ovn_controller[94850]: 2026-01-22T22:47:49Z|00651|binding|INFO|a585bd6a-2858-42c6-a61d-e2f5ae701b22: Claiming fa:16:3e:67:da:ab 10.100.0.8
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.130 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:da:ab 10.100.0.8'], port_security=['fa:16:3e:67:da:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f14033-82f9-4533-a194-36532baa893b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '039e09b7-4927-4c69-bb9d-1012bf4a1d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c5e7990-8af4-4ab4-b8e4-c75ffda3dd74, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a585bd6a-2858-42c6-a61d-e2f5ae701b22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.131 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a585bd6a-2858-42c6-a61d-e2f5ae701b22 in datapath f3f14033-82f9-4533-a194-36532baa893b bound to our chassis
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.133 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f3f14033-82f9-4533-a194-36532baa893b
Jan 22 22:47:49 compute-0 ovn_controller[94850]: 2026-01-22T22:47:49Z|00652|binding|INFO|Setting lport a585bd6a-2858-42c6-a61d-e2f5ae701b22 ovn-installed in OVS
Jan 22 22:47:49 compute-0 ovn_controller[94850]: 2026-01-22T22:47:49Z|00653|binding|INFO|Setting lport a585bd6a-2858-42c6-a61d-e2f5ae701b22 up in Southbound
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.138 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.141 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.145 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0f70d209-8c2a-4164-8264-dfd67820e18d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.146 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf3f14033-81 in ovnmeta-f3f14033-82f9-4533-a194-36532baa893b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.148 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf3f14033-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.148 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9f8220-270b-48bf-b74e-554146714081]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.149 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a6b1ef-95a6-438f-ad20-9e0bd8741e47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 systemd-udevd[234958]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:47:49 compute-0 systemd-machined[154006]: New machine qemu-71-instance-000000a1.
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.161 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1638a240-826c-4fae-b1fe-354ab48a68b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.1687] device (tapa585bd6a-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.1694] device (tapa585bd6a-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:47:49 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-000000a1.
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.186 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[afd00262-6c8f-4b60-87cf-67e0a265cce0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.213 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[6927535e-1c5e-4029-a36f-17cc34999348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.2195] manager: (tapf3f14033-80): new Veth device (/org/freedesktop/NetworkManager/Devices/301)
Jan 22 22:47:49 compute-0 systemd-udevd[234962]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.218 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[83b18a63-15ed-43e1-9677-87212b645c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.250 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[80967ebf-629b-45c9-b8b5-bf6e66edcd58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.254 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5028a7-6994-4dfe-b77e-8f1681f100c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.2742] device (tapf3f14033-80): carrier: link connected
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.279 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[2547dcfe-070d-49e2-a566-bf2243d92a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.296 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[994c31a5-c87f-462a-989c-b3a53c6bd8cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f14033-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:30:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570687, 'reachable_time': 22921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234990, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.311 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[164f6727-9da9-41b1-a4a4-8c65d7a3d82e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:3086'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570687, 'tstamp': 570687}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234996, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.329 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5266fb-4ef9-4911-b4c9-1eeabdd23752]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f14033-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:30:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570687, 'reachable_time': 22921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234998, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.358 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[120d0949-7b79-4b83-9285-23dadc97c9c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.384 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122069.3840265, cb8ac1a6-ad25-4019-add5-64c347b769cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.384 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] VM Started (Lifecycle Event)
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.408 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.409 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a788b245-e7ee-4bd5-a06f-bb325f88c51d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.411 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f14033-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.411 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.412 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3f14033-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.412 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122069.384194, cb8ac1a6-ad25-4019-add5-64c347b769cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.412 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] VM Paused (Lifecycle Event)
Jan 22 22:47:49 compute-0 NetworkManager[54954]: <info>  [1769122069.4150] manager: (tapf3f14033-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 22 22:47:49 compute-0 kernel: tapf3f14033-80: entered promiscuous mode
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.415 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.417 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.418 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf3f14033-80, col_values=(('external_ids', {'iface-id': '941474a6-10cc-4642-b048-e5e47f4d8a09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.419 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 ovn_controller[94850]: 2026-01-22T22:47:49Z|00654|binding|INFO|Releasing lport 941474a6-10cc-4642-b048-e5e47f4d8a09 from this chassis (sb_readonly=0)
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.431 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.431 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.432 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[43ee4448-af87-4533-b8c3-b4230ec9c4cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.433 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-f3f14033-82f9-4533-a194-36532baa893b
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID f3f14033-82f9-4533-a194-36532baa893b
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:47:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:47:49.434 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'env', 'PROCESS_TAG=haproxy-f3f14033-82f9-4533-a194-36532baa893b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f3f14033-82f9-4533-a194-36532baa893b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.441 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.444 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:47:49 compute-0 nova_compute[182725]: 2026-01-22 22:47:49.465 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:47:49 compute-0 podman[235031]: 2026-01-22 22:47:49.769015929 +0000 UTC m=+0.049510433 container create beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:47:49 compute-0 systemd[1]: Started libpod-conmon-beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629.scope.
Jan 22 22:47:49 compute-0 podman[235031]: 2026-01-22 22:47:49.744816517 +0000 UTC m=+0.025311041 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:47:49 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:47:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd4e0fa1b1e393c262bce6eebc08dc8880bfe58ba865f16b97ab7a09ba88541e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:47:49 compute-0 podman[235031]: 2026-01-22 22:47:49.865928011 +0000 UTC m=+0.146422525 container init beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 22:47:49 compute-0 podman[235031]: 2026-01-22 22:47:49.871176031 +0000 UTC m=+0.151670525 container start beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:47:49 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [NOTICE]   (235050) : New worker (235052) forked
Jan 22 22:47:49 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [NOTICE]   (235050) : Loading success.
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.036 182729 DEBUG nova.compute.manager [req-f96815a2-f006-4eeb-8fe6-0894592c0ecd req-15462df3-f8c2-48e9-86b5-0f73b86326f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.037 182729 DEBUG oslo_concurrency.lockutils [req-f96815a2-f006-4eeb-8fe6-0894592c0ecd req-15462df3-f8c2-48e9-86b5-0f73b86326f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.037 182729 DEBUG oslo_concurrency.lockutils [req-f96815a2-f006-4eeb-8fe6-0894592c0ecd req-15462df3-f8c2-48e9-86b5-0f73b86326f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.038 182729 DEBUG oslo_concurrency.lockutils [req-f96815a2-f006-4eeb-8fe6-0894592c0ecd req-15462df3-f8c2-48e9-86b5-0f73b86326f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.038 182729 DEBUG nova.compute.manager [req-f96815a2-f006-4eeb-8fe6-0894592c0ecd req-15462df3-f8c2-48e9-86b5-0f73b86326f9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Processing event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.040 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.044 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122071.0446029, cb8ac1a6-ad25-4019-add5-64c347b769cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.045 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] VM Resumed (Lifecycle Event)
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.046 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.050 182729 INFO nova.virt.libvirt.driver [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Instance spawned successfully.
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.050 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.065 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.070 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.074 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.074 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.075 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.075 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.075 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.076 182729 DEBUG nova.virt.libvirt.driver [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.105 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.142 182729 INFO nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Took 9.66 seconds to spawn the instance on the hypervisor.
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.143 182729 DEBUG nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.217 182729 INFO nova.compute.manager [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Took 10.27 seconds to build instance.
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.235 182729 DEBUG oslo_concurrency.lockutils [None req-f7ebf4cb-27c9-4b14-83eb-61dedf709773 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.298 182729 DEBUG nova.network.neutron [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updated VIF entry in instance network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.299 182729 DEBUG nova.network.neutron [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.320 182729 DEBUG oslo_concurrency.lockutils [req-801e5385-08ac-464a-a705-c80d834ea186 req-8f0698df-da7d-468d-ad6f-9fd5ae570bf3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:51 compute-0 nova_compute[182725]: 2026-01-22 22:47:51.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:47:52 compute-0 nova_compute[182725]: 2026-01-22 22:47:52.069 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.116 182729 DEBUG nova.compute.manager [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.116 182729 DEBUG oslo_concurrency.lockutils [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.117 182729 DEBUG oslo_concurrency.lockutils [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.117 182729 DEBUG oslo_concurrency.lockutils [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.117 182729 DEBUG nova.compute.manager [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] No waiting events found dispatching network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:47:53 compute-0 nova_compute[182725]: 2026-01-22 22:47:53.117 182729 WARNING nova.compute.manager [req-61c0c1d5-824a-4792-845c-cabbf896e720 req-07594d97-13fb-4868-8336-b9051992585f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received unexpected event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 for instance with vm_state active and task_state None.
Jan 22 22:47:54 compute-0 nova_compute[182725]: 2026-01-22 22:47:54.434 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:55 compute-0 podman[235061]: 2026-01-22 22:47:55.125238753 +0000 UTC m=+0.061059100 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:47:55 compute-0 podman[235062]: 2026-01-22 22:47:55.140896723 +0000 UTC m=+0.075871749 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:47:55 compute-0 podman[235063]: 2026-01-22 22:47:55.150906722 +0000 UTC m=+0.072282470 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.072 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.095 182729 DEBUG nova.compute.manager [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.095 182729 DEBUG nova.compute.manager [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing instance network info cache due to event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.096 182729 DEBUG oslo_concurrency.lockutils [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.096 182729 DEBUG oslo_concurrency.lockutils [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:47:57 compute-0 nova_compute[182725]: 2026-01-22 22:47:57.097 182729 DEBUG nova.network.neutron [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:47:58 compute-0 nova_compute[182725]: 2026-01-22 22:47:58.191 182729 DEBUG nova.network.neutron [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updated VIF entry in instance network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:47:58 compute-0 nova_compute[182725]: 2026-01-22 22:47:58.192 182729 DEBUG nova.network.neutron [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:47:58 compute-0 nova_compute[182725]: 2026-01-22 22:47:58.216 182729 DEBUG oslo_concurrency.lockutils [req-800adb62-c998-48ad-bc29-95f7093706f0 req-6aeff7cb-5fce-49cc-a376-a97a5908cff7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:47:59 compute-0 nova_compute[182725]: 2026-01-22 22:47:59.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:02 compute-0 nova_compute[182725]: 2026-01-22 22:48:02.074 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:03 compute-0 nova_compute[182725]: 2026-01-22 22:48:03.303 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:03.303 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:48:03 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:03.306 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:48:04 compute-0 ovn_controller[94850]: 2026-01-22T22:48:04Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:da:ab 10.100.0.8
Jan 22 22:48:04 compute-0 ovn_controller[94850]: 2026-01-22T22:48:04Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:da:ab 10.100.0.8
Jan 22 22:48:04 compute-0 nova_compute[182725]: 2026-01-22 22:48:04.439 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.308 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.403 182729 DEBUG nova.compute.manager [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.404 182729 DEBUG nova.compute.manager [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing instance network info cache due to event network-changed-9ca80ea8-671c-4688-a4d3-26fc656e645e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.405 182729 DEBUG oslo_concurrency.lockutils [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.405 182729 DEBUG oslo_concurrency.lockutils [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.406 182729 DEBUG nova.network.neutron [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Refreshing network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.536 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.537 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.537 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.538 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.538 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.554 182729 INFO nova.compute.manager [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Terminating instance
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.572 182729 DEBUG nova.compute.manager [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:48:05 compute-0 kernel: tap9ca80ea8-67 (unregistering): left promiscuous mode
Jan 22 22:48:05 compute-0 NetworkManager[54954]: <info>  [1769122085.6021] device (tap9ca80ea8-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.617 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00655|binding|INFO|Releasing lport 9ca80ea8-671c-4688-a4d3-26fc656e645e from this chassis (sb_readonly=0)
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00656|binding|INFO|Setting lport 9ca80ea8-671c-4688-a4d3-26fc656e645e down in Southbound
Jan 22 22:48:05 compute-0 kernel: tap355e0aa9-5b (unregistering): left promiscuous mode
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00657|binding|INFO|Removing iface tap9ca80ea8-67 ovn-installed in OVS
Jan 22 22:48:05 compute-0 NetworkManager[54954]: <info>  [1769122085.6249] device (tap355e0aa9-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.626 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.631 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:b4:50 10.100.0.14'], port_security=['fa:16:3e:e6:b4:50 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e08a106b-5819-44ac-bcad-850c349c17cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55b79226-17c7-4623-9f19-8585aca1b119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=297b2520-2860-4872-a497-3a3478b0820d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=9ca80ea8-671c-4688-a4d3-26fc656e645e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.632 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 9ca80ea8-671c-4688-a4d3-26fc656e645e in datapath 55b79226-17c7-4623-9f19-8585aca1b119 unbound from our chassis
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.634 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55b79226-17c7-4623-9f19-8585aca1b119, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.635 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c63d87e4-8fa5-41cc-a9a4-33c42b2b6689]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.635 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 namespace which is not needed anymore
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.659 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00658|binding|INFO|Releasing lport 355e0aa9-5b7b-417a-a2e2-dea353a114d0 from this chassis (sb_readonly=0)
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00659|binding|INFO|Setting lport 355e0aa9-5b7b-417a-a2e2-dea353a114d0 down in Southbound
Jan 22 22:48:05 compute-0 ovn_controller[94850]: 2026-01-22T22:48:05Z|00660|binding|INFO|Removing iface tap355e0aa9-5b ovn-installed in OVS
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.669 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.672 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:68:7d 2001:db8:0:1:f816:3eff:fe56:687d 2001:db8::f816:3eff:fe56:687d'], port_security=['fa:16:3e:56:68:7d 2001:db8:0:1:f816:3eff:fe56:687d 2001:db8::f816:3eff:fe56:687d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe56:687d/64 2001:db8::f816:3eff:fe56:687d/64', 'neutron:device_id': 'e08a106b-5819-44ac-bcad-850c349c17cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=355e0aa9-5b7b-417a-a2e2-dea353a114d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 22 22:48:05 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000009e.scope: Consumed 15.740s CPU time.
Jan 22 22:48:05 compute-0 systemd-machined[154006]: Machine qemu-70-instance-0000009e terminated.
Jan 22 22:48:05 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [NOTICE]   (234608) : haproxy version is 2.8.14-c23fe91
Jan 22 22:48:05 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [NOTICE]   (234608) : path to executable is /usr/sbin/haproxy
Jan 22 22:48:05 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [WARNING]  (234608) : Exiting Master process...
Jan 22 22:48:05 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [ALERT]    (234608) : Current worker (234610) exited with code 143 (Terminated)
Jan 22 22:48:05 compute-0 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[234604]: [WARNING]  (234608) : All workers exited. Exiting... (0)
Jan 22 22:48:05 compute-0 systemd[1]: libpod-1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9.scope: Deactivated successfully.
Jan 22 22:48:05 compute-0 podman[235164]: 2026-01-22 22:48:05.799388576 +0000 UTC m=+0.046200161 container died 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:48:05 compute-0 NetworkManager[54954]: <info>  [1769122085.8023] manager: (tap355e0aa9-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/303)
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9-userdata-shm.mount: Deactivated successfully.
Jan 22 22:48:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-149a517feb78e5daa9e498256278bd160695c0033672d675172f9e27e8fa25be-merged.mount: Deactivated successfully.
Jan 22 22:48:05 compute-0 podman[235164]: 2026-01-22 22:48:05.877160521 +0000 UTC m=+0.123972106 container cleanup 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:48:05 compute-0 systemd[1]: libpod-conmon-1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9.scope: Deactivated successfully.
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.887 182729 INFO nova.virt.libvirt.driver [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Instance destroyed successfully.
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.887 182729 DEBUG nova.objects.instance [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid e08a106b-5819-44ac-bcad-850c349c17cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.899 182729 DEBUG nova.virt.libvirt.vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:46:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:46:56Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.900 182729 DEBUG nova.network.os_vif_util [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.900 182729 DEBUG nova.network.os_vif_util [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.900 182729 DEBUG os_vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.902 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.902 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ca80ea8-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.903 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.906 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.908 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.911 182729 INFO os_vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b4:50,bridge_name='br-int',has_traffic_filtering=True,id=9ca80ea8-671c-4688-a4d3-26fc656e645e,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ca80ea8-67')
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.911 182729 DEBUG nova.virt.libvirt.vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:46:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-668956998',display_name='tempest-TestGettingAddress-server-668956998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-668956998',id=158,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:46:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-p89v47g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:46:56Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=e08a106b-5819-44ac-bcad-850c349c17cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.911 182729 DEBUG nova.network.os_vif_util [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.912 182729 DEBUG nova.network.os_vif_util [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.912 182729 DEBUG os_vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.913 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.913 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap355e0aa9-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.914 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.917 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.918 182729 INFO os_vif [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:68:7d,bridge_name='br-int',has_traffic_filtering=True,id=355e0aa9-5b7b-417a-a2e2-dea353a114d0,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap355e0aa9-5b')
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.919 182729 INFO nova.virt.libvirt.driver [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Deleting instance files /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf_del
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.919 182729 INFO nova.virt.libvirt.driver [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Deletion of /var/lib/nova/instances/e08a106b-5819-44ac-bcad-850c349c17cf_del complete
Jan 22 22:48:05 compute-0 podman[235218]: 2026-01-22 22:48:05.939658416 +0000 UTC m=+0.041537564 container remove 1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.944 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[521efd1e-b3af-4a80-bf5d-428c0d22ddae]: (4, ('Thu Jan 22 10:48:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 (1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9)\n1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9\nThu Jan 22 10:48:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 (1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9)\n1831e58ae6d70e88b4216ad1a933f07f1b29fdd7f38cd0779e4a07a60a5f2ba9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.945 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c699d15d-39df-4ddd-bd19-f1f2d77f9c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.946 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55b79226-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 kernel: tap55b79226-10: left promiscuous mode
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.962 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9754d2-a714-46dd-9c1a-063841bba535]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.977 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9786a246-764e-486c-9ac3-7a745d9a7712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.978 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4857c2-855b-4a05-ac3f-efcac0398803]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.992 182729 INFO nova.compute.manager [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.993 182729 DEBUG oslo.service.loopingcall [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.993 182729 DEBUG nova.compute.manager [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:48:05 compute-0 nova_compute[182725]: 2026-01-22 22:48:05.994 182729 DEBUG nova.network.neutron [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:05.999 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1659efc1-da82-4f7f-9d64-1d53ef05bab8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565322, 'reachable_time': 21738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235234, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.001 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.002 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[e03fa384-0ce1-4733-9dae-1741cea6fb8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.002 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 355e0aa9-5b7b-417a-a2e2-dea353a114d0 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 unbound from our chassis
Jan 22 22:48:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d55b79226\x2d17c7\x2d4623\x2d9f19\x2d8585aca1b119.mount: Deactivated successfully.
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.004 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09b515c7-d044-43d4-b895-408eb5de1fd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.005 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3cf6b1-ac17-4e91-a646-1abb1f133d47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.005 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 namespace which is not needed anymore
Jan 22 22:48:06 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [NOTICE]   (234682) : haproxy version is 2.8.14-c23fe91
Jan 22 22:48:06 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [NOTICE]   (234682) : path to executable is /usr/sbin/haproxy
Jan 22 22:48:06 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [WARNING]  (234682) : Exiting Master process...
Jan 22 22:48:06 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [ALERT]    (234682) : Current worker (234684) exited with code 143 (Terminated)
Jan 22 22:48:06 compute-0 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[234678]: [WARNING]  (234682) : All workers exited. Exiting... (0)
Jan 22 22:48:06 compute-0 systemd[1]: libpod-e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419.scope: Deactivated successfully.
Jan 22 22:48:06 compute-0 conmon[234678]: conmon e214f3c1d9874fec8739 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419.scope/container/memory.events
Jan 22 22:48:06 compute-0 podman[235252]: 2026-01-22 22:48:06.151673042 +0000 UTC m=+0.045470463 container died e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:48:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419-userdata-shm.mount: Deactivated successfully.
Jan 22 22:48:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-1721badf8709fb832e30cddd195fe16a8b68c61607d8b228befae179d14d96e3-merged.mount: Deactivated successfully.
Jan 22 22:48:06 compute-0 podman[235252]: 2026-01-22 22:48:06.17974562 +0000 UTC m=+0.073543041 container cleanup e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:48:06 compute-0 systemd[1]: libpod-conmon-e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419.scope: Deactivated successfully.
Jan 22 22:48:06 compute-0 podman[235280]: 2026-01-22 22:48:06.243161168 +0000 UTC m=+0.044687923 container remove e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.248 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8a13185b-f29f-4c82-9f53-3c8bb5a5fcc0]: (4, ('Thu Jan 22 10:48:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 (e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419)\ne214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419\nThu Jan 22 10:48:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 (e214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419)\ne214f3c1d9874fec8739a2117f75790c4b57ff98feac9af38ced6407fae47419\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.250 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f396cb7-0c7f-4d9a-bcfb-4827ac84157f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.251 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09b515c7-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.253 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:06 compute-0 kernel: tap09b515c7-d0: left promiscuous mode
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.264 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.266 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[75c857b7-787e-444a-9f96-a039728ec173]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.284 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[44966d39-eb82-442d-bc08-62aff3ffc0b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.285 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6b07a2f2-6996-4de7-a6c9-2488259eb80b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.302 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7daff44b-a6d1-45c6-bf02-d02fb76c4df9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565427, 'reachable_time': 22057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235298, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.305 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:48:06 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:06.305 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[73760ccc-297e-4cf1-ae59-8661443ef210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.493 182729 DEBUG nova.network.neutron [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updated VIF entry in instance network info cache for port 9ca80ea8-671c-4688-a4d3-26fc656e645e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.493 182729 DEBUG nova.network.neutron [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [{"id": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "address": "fa:16:3e:e6:b4:50", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ca80ea8-67", "ovs_interfaceid": "9ca80ea8-671c-4688-a4d3-26fc656e645e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "address": "fa:16:3e:56:68:7d", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe56:687d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap355e0aa9-5b", "ovs_interfaceid": "355e0aa9-5b7b-417a-a2e2-dea353a114d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.509 182729 DEBUG oslo_concurrency.lockutils [req-989bc1a6-c582-410d-9577-5b51cd66cd4d req-98a88f63-a1e2-4c5f-96de-3d47ada5bf16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e08a106b-5819-44ac-bcad-850c349c17cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.796 182729 DEBUG nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-unplugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.796 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.796 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.796 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.796 182729 DEBUG nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-unplugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-unplugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG oslo_concurrency.lockutils [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.797 182729 DEBUG nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.798 182729 WARNING nova.compute.manager [req-fd705e0c-173d-4a0c-a0b3-2ee42bb38738 req-beeffe0d-464b-460d-920f-d060348c97b4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received unexpected event network-vif-plugged-355e0aa9-5b7b-417a-a2e2-dea353a114d0 for instance with vm_state active and task_state deleting.
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.864 182729 DEBUG nova.network.neutron [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:48:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d09b515c7\x2dd044\x2d43d4\x2db895\x2d408eb5de1fd8.mount: Deactivated successfully.
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.879 182729 INFO nova.compute.manager [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Took 0.89 seconds to deallocate network for instance.
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.944 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:06 compute-0 nova_compute[182725]: 2026-01-22 22:48:06.944 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.006 182729 DEBUG nova.compute.provider_tree [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.026 182729 DEBUG nova.scheduler.client.report [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.048 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.076 182729 INFO nova.scheduler.client.report [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance e08a106b-5819-44ac-bcad-850c349c17cf
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.193 182729 DEBUG oslo_concurrency.lockutils [None req-2dbddfe7-0e90-446d-8fbd-c502809c2cea 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.215 182729 DEBUG nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-unplugged-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.215 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.216 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.216 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.217 182729 DEBUG nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-unplugged-9ca80ea8-671c-4688-a4d3-26fc656e645e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.217 182729 WARNING nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received unexpected event network-vif-unplugged-9ca80ea8-671c-4688-a4d3-26fc656e645e for instance with vm_state deleted and task_state None.
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.217 182729 DEBUG nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.218 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.218 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.218 182729 DEBUG oslo_concurrency.lockutils [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e08a106b-5819-44ac-bcad-850c349c17cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.219 182729 DEBUG nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] No waiting events found dispatching network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.219 182729 WARNING nova.compute.manager [req-8569b52a-5950-4312-9ab9-aa0f2e51b960 req-e3e31e73-1c6a-4c2c-b644-c3e2c77d8c24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received unexpected event network-vif-plugged-9ca80ea8-671c-4688-a4d3-26fc656e645e for instance with vm_state deleted and task_state None.
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.703 182729 DEBUG nova.compute.manager [req-81d761ea-66d6-4794-a316-c6b77922f713 req-dc2a814c-bdc6-4a36-b630-2a07bb11aff5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-deleted-9ca80ea8-671c-4688-a4d3-26fc656e645e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:07 compute-0 nova_compute[182725]: 2026-01-22 22:48:07.703 182729 DEBUG nova.compute.manager [req-81d761ea-66d6-4794-a316-c6b77922f713 req-dc2a814c-bdc6-4a36-b630-2a07bb11aff5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Received event network-vif-deleted-355e0aa9-5b7b-417a-a2e2-dea353a114d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.117 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'name': 'tempest-TestSnapshotPattern-server-207363830', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a1', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a0876e1a4cab4f9997487dc31953aafd', 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'hostId': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.118 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.118 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>]
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.132 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.133 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a59465c-c44f-44a8-8252-32c95e61987f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.119053', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c602256-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': 'c517715c869b5f4e2c8cdb54e5cad30146d03e8faac246a852f1882536632a62'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.119053', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c602cf6-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': '7ed24e8579b24c74655e822007c663051a6db6e3db84d2899d4539884ecdd41b'}]}, 'timestamp': '2026-01-22 22:48:09.133311', '_unique_id': '7b9fd05ebd43405aad398f1486c85618'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.134 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.135 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.135 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d0dbfe7-2b8b-423e-b2a4-b90d0c7896e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.135361', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c6086f6-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': '21b42cfef130eda4cfe8b7070531e63dd61ec39a7d8542fb2de73e9d119ad50b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.135361', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c608f98-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': '4325f11ee9649ba589aefdc6bd91dd3a75b5944c88e07c144386b5296396b838'}]}, 'timestamp': '2026-01-22 22:48:09.135824', '_unique_id': '591b680d7a6d42389ab4a4de7ef05bc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.136 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 22:48:09 compute-0 podman[235299]: 2026-01-22 22:48:09.149046361 +0000 UTC m=+0.072339929 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.151 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/cpu volume: 10820000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4eec66c-25b9-49f0-aa2d-341623e63dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10820000000, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'timestamp': '2026-01-22T22:48:09.136950', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6c6315a6-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.809126592, 'message_signature': '1729a465050c434d4003de81c76582fa73a23416c5d2131dd06618292661e491'}]}, 'timestamp': '2026-01-22 22:48:09.152420', '_unique_id': '960ac5c276624063a9b35d840651c26a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.153 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>]
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>]
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.178 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.178 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b899fef5-56e4-438f-8d76-a275ab333ea0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.154892', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c671b24-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '64203da531777125853698751fe24eda51cb98825310fbb4dac8c3fc48450cf4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': 
None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.154892', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6728da-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '717ecf71c633653635639fd730cc580d35e913dc59eb04ba41fe8ff04e12bfe0'}]}, 'timestamp': '2026-01-22 22:48:09.179068', '_unique_id': 'f45d22f9f1c84fbfa8c870e4b1995174'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.179 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51d5cc12-038a-421c-8aff-9d1c6969aea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'timestamp': '2026-01-22T22:48:09.181117', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '6c6783c0-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.809126592, 'message_signature': 'ac64a65372a0c625d8784d64c6db996af0af24199befdbe249ab4c4be2f80e4c'}]}, 'timestamp': '2026-01-22 22:48:09.181385', '_unique_id': '28684a9dbe674604971f36e5bb9d80cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.181 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.184 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cb8ac1a6-ad25-4019-add5-64c347b769cb / tapa585bd6a-28 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.184 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b475937c-57a7-4594-8b87-38c4ba7fd83a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.182926', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c681808-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '163f4f786438b227084a9d4df85a99677004854a6b58bd5632e427813b46ae79'}]}, 'timestamp': '2026-01-22 22:48:09.185198', '_unique_id': '7c405a154feb42908a85ac16c07515e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.185 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.186 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367e2735-0672-4b52-b31f-21c97aaaf828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.186760', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6860ce-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': 'b1d6375cb054edb2eb441858815bc8ac79a020af413e3237ff093e53e668ead5'}]}, 'timestamp': '2026-01-22 22:48:09.187049', '_unique_id': '75c2d8e723794e75977dede649ac5807'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.187 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.188 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0c4ce49-3d33-4e3a-afe8-ceb567c8f7eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.188456', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c68a250-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '93b301acb4d9e2828ebbf3f6362e3c7c300b12f863ea2369aab892d06111594e'}]}, 'timestamp': '2026-01-22 22:48:09.188729', '_unique_id': 'ac0128dcd071435680a16d947506a6a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.189 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.190 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.requests volume: 278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.190 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cc68ec2-8acb-4927-b71a-86afcbd4483c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 278, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.190422', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c68f354-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '89ac4ee0534ffff33d7426740b80ada1df6e8eac9c553c4e718bd0cce3c1ed40'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.190422', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c690272-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '23f265b1061185ec5d9c649eb8488974986d4559400dd1886efee175541b3aa0'}]}, 'timestamp': '2026-01-22 22:48:09.191233', '_unique_id': 'f02e18ab3fa34b1686213f9988ca14c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.192 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.193 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f60ea74-90eb-482e-b02b-263567eeb9b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.193250', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c695e2a-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '1e4d0e8cd2b187848f1208b74a61b0fab0f47f7e5888ba13317e1fee70e46755'}]}, 'timestamp': '2026-01-22 22:48:09.193584', '_unique_id': '7317d678c572419cb8d361d8d3835f7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.195 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbdd5e9b-f939-426a-8e36-ea1f0ca3d329', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.195086', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c69a592-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '8223ab70bb5da7a4c1542a8fe781bd3858b1ccb710c0d55637a9d4ce2b8e6dc0'}]}, 'timestamp': '2026-01-22 22:48:09.195414', '_unique_id': 'e8cd5431669b46488fb31fcf3a3d08c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.196 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.197 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-207363830>]
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.197 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.197 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4eefa17-65d2-4162-a46a-c0504ade7500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.197337', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c69fdbc-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': 'c985637212371b31b01b7713098b1aae7f77296fba9047ae9fada004f3aa593c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.197337', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6a09ba-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': 'd84f2a59204b4107b4720823c193b2a2024bb92b5a7aa9f4ac4d05cbb3854e7e'}]}, 'timestamp': '2026-01-22 22:48:09.197979', '_unique_id': '73c0667a1c82423c9bd3e527560d817f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.198 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.199 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.latency volume: 146395171 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.199 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.latency volume: 17374678 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44dee878-5a48-49f0-a4eb-2d7aeb19db6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 146395171, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.199473', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c6a50d2-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '71065471d97c16e947ff43d5b7ed35b052027534f107e19bb301cdc6fbafecb4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17374678, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 
'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.199473', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6a5d0c-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '64f35aa3ba31dab7e967dd811d1212a3cb2a3d4968291cbaba9790529017ac74'}]}, 'timestamp': '2026-01-22 22:48:09.200092', '_unique_id': '50a562be50e14c6fb142c48f010beedf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.200 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.201 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.201 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee8895e6-6a72-4506-9151-eaf017d20d5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.201655', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c6aa6b8-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '47c61a0d0e62539246533cbde372a8f822c149936f7d0c94335fbc03c56aef5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 
'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.201655', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6ab2e8-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '9842de610818a312fffd63548a3a4ae2065d3c5126ef7c2f7444db355dbfd96b'}]}, 'timestamp': '2026-01-22 22:48:09.202291', '_unique_id': 'eaac958d2c86451eb9440b4dd45863d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.203 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.204 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb8cc09b-c35a-437a-ac9d-08ab9e79e1af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.203916', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c6afe6a-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': '03c7a907ccecf6d3ab66ec2dd46fe8757cbcd7d6e55b32b05e59f67cdf378adc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 
'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.203916', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6b09b4-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.776308425, 'message_signature': '30453f2056598e21d7655d1b2ede4be99240ac91c9370f17a49e0e5e33cbbe16'}]}, 'timestamp': '2026-01-22 22:48:09.204510', '_unique_id': 'e2f71111a9294ae39ce41dffe114261c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.206 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.latency volume: 1623628497 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.206 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e65d808d-31bf-4c53-8279-d42d50f498c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1623628497, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-vda', 'timestamp': '2026-01-22T22:48:09.206045', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6c6b5162-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': '6398fbb3fcef04ded94c8f99e6ea93ef1bfd9a636dd95c8aea1c6733690ad36f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 
'resource_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb-sda', 'timestamp': '2026-01-22T22:48:09.206045', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'instance-000000a1', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6c6b5ca2-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.812152787, 'message_signature': 'c7ed59ec292817e20b2f9643d0f81ff1d86705458dc19150ce744d0ef13c08a8'}]}, 'timestamp': '2026-01-22 22:48:09.206633', '_unique_id': '5de101008366441da3daba6b1f6698d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.207 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.208 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7787921b-364b-45a2-982f-ba81f2fe5b60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.208132', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6ba31a-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '1059cb98b1536d394ea3cd831c7639c9d8ffa266790c76343b1a66ffa70e6c1c'}]}, 'timestamp': '2026-01-22 22:48:09.208454', '_unique_id': '068faab8e3ae45dcbff8f08346bc2cea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.209 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b344a76-9b6e-46cb-a0b0-30698d85f670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.209907', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6be97e-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '8ba47f52499905c9db65dde812d6ee0b00bc40f3b7d82e04e86825909f53cf55'}]}, 'timestamp': '2026-01-22 22:48:09.210256', '_unique_id': 'c40b3f028efc43ff88f8d8862f412687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.210 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.211 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387a6640-4957-4afb-9ecf-25fcaacae049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.211915', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6c36d6-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': 'c48c48b964f2bfe66636912ee12e07f0d22c88f90ff3abe43917d8846aa574ef'}]}, 'timestamp': '2026-01-22 22:48:09.212235', '_unique_id': '3000d0a14f0e4741bb5f43f17bc21993'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.212 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.213 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26b7e943-e059-4dd1-ad90-9be0e1519252', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.213674', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6c7c22-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '8031dde4efc91f8a7450445d87f19e1cf21109b36314aaede0b74fab0d96e475'}]}, 'timestamp': '2026-01-22 22:48:09.214028', '_unique_id': '07cf5f9edc0b41489b135958d8fa53d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.214 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.215 12 DEBUG ceilometer.compute.pollsters [-] cb8ac1a6-ad25-4019-add5-64c347b769cb/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce8543f3-6ee9-412f-aec8-3a245c56ffcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_name': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_name': None, 'resource_id': 'instance-000000a1-cb8ac1a6-ad25-4019-add5-64c347b769cb-tapa585bd6a-28', 'timestamp': '2026-01-22T22:48:09.215461', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-207363830', 'name': 'tapa585bd6a-28', 'instance_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'instance_type': 'm1.nano', 'host': 'e2b01475094fba9e27c424375e7680d74867ea27046fb23989615b98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:67:da:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa585bd6a-28'}, 'message_id': '6c6cc164-f7e4-11f0-9a35-fa163e3d8874', 'monotonic_time': 5726.840210595, 'message_signature': '5608b3d15f1c8d9e4b914ce021f9f75dad007f13845d345899f5ed8aa12502c7'}]}, 'timestamp': '2026-01-22 22:48:09.215799', '_unique_id': '04eaa3d32a044d28af7c966027df61e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 22:48:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:48:09.216 12 ERROR oslo_messaging.notify.messaging 
Jan 22 22:48:09 compute-0 nova_compute[182725]: 2026-01-22 22:48:09.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:10 compute-0 nova_compute[182725]: 2026-01-22 22:48:10.916 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:12.456 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:12.457 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:12.458 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:13 compute-0 nova_compute[182725]: 2026-01-22 22:48:13.423 182729 DEBUG nova.compute.manager [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:48:13 compute-0 nova_compute[182725]: 2026-01-22 22:48:13.487 182729 INFO nova.compute.manager [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] instance snapshotting
Jan 22 22:48:13 compute-0 nova_compute[182725]: 2026-01-22 22:48:13.718 182729 INFO nova.virt.libvirt.driver [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Beginning live snapshot process
Jan 22 22:48:14 compute-0 virtqemud[182297]: invalid argument: disk vda does not have an active block job
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.107 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.166 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.167 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.254 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json -f qcow2" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.266 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.349 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.350 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.388 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.389 182729 INFO nova.virt.libvirt.driver [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.443 182729 DEBUG nova.virt.libvirt.guest [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.478 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.948 182729 DEBUG nova.virt.libvirt.guest [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 22 22:48:14 compute-0 nova_compute[182725]: 2026-01-22 22:48:14.955 182729 INFO nova.virt.libvirt.driver [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 22 22:48:15 compute-0 nova_compute[182725]: 2026-01-22 22:48:15.029 182729 DEBUG nova.privsep.utils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 22:48:15 compute-0 nova_compute[182725]: 2026-01-22 22:48:15.030 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a.delta /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:15 compute-0 nova_compute[182725]: 2026-01-22 22:48:15.479 182729 DEBUG oslo_concurrency.processutils [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a.delta /var/lib/nova/instances/snapshots/tmpngqlamiy/94894b42a1aa4605a69c11cc1660a91a" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:15 compute-0 nova_compute[182725]: 2026-01-22 22:48:15.493 182729 INFO nova.virt.libvirt.driver [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Snapshot extracted, beginning image upload
Jan 22 22:48:15 compute-0 nova_compute[182725]: 2026-01-22 22:48:15.919 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:16 compute-0 podman[235356]: 2026-01-22 22:48:16.179574106 +0000 UTC m=+0.109892675 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7)
Jan 22 22:48:16 compute-0 podman[235355]: 2026-01-22 22:48:16.212976118 +0000 UTC m=+0.140566539 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:48:18 compute-0 nova_compute[182725]: 2026-01-22 22:48:18.058 182729 INFO nova.virt.libvirt.driver [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Snapshot image upload complete
Jan 22 22:48:18 compute-0 nova_compute[182725]: 2026-01-22 22:48:18.059 182729 INFO nova.compute.manager [None req-ddc7016d-b4ce-42c6-994e-bbd0d509484b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Took 4.55 seconds to snapshot the instance on the hypervisor.
Jan 22 22:48:18 compute-0 ovn_controller[94850]: 2026-01-22T22:48:18Z|00661|binding|INFO|Releasing lport 941474a6-10cc-4642-b048-e5e47f4d8a09 from this chassis (sb_readonly=0)
Jan 22 22:48:18 compute-0 nova_compute[182725]: 2026-01-22 22:48:18.280 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:19 compute-0 nova_compute[182725]: 2026-01-22 22:48:19.515 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:20 compute-0 nova_compute[182725]: 2026-01-22 22:48:20.885 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122085.8842845, e08a106b-5819-44ac-bcad-850c349c17cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:48:20 compute-0 nova_compute[182725]: 2026-01-22 22:48:20.886 182729 INFO nova.compute.manager [-] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] VM Stopped (Lifecycle Event)
Jan 22 22:48:20 compute-0 nova_compute[182725]: 2026-01-22 22:48:20.905 182729 DEBUG nova.compute.manager [None req-76241bb3-b4aa-4c82-b8bc-15d8df911eae - - - - - -] [instance: e08a106b-5819-44ac-bcad-850c349c17cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:48:20 compute-0 nova_compute[182725]: 2026-01-22 22:48:20.923 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:24 compute-0 nova_compute[182725]: 2026-01-22 22:48:24.519 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:25 compute-0 nova_compute[182725]: 2026-01-22 22:48:25.902 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:25 compute-0 nova_compute[182725]: 2026-01-22 22:48:25.925 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:26 compute-0 podman[235399]: 2026-01-22 22:48:26.118725384 +0000 UTC m=+0.057726127 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:48:26 compute-0 podman[235401]: 2026-01-22 22:48:26.129564804 +0000 UTC m=+0.057229505 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:48:26 compute-0 podman[235400]: 2026-01-22 22:48:26.176748268 +0000 UTC m=+0.100554513 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent)
Jan 22 22:48:28 compute-0 nova_compute[182725]: 2026-01-22 22:48:28.275 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:29 compute-0 nova_compute[182725]: 2026-01-22 22:48:29.521 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:30 compute-0 nova_compute[182725]: 2026-01-22 22:48:30.928 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:34 compute-0 nova_compute[182725]: 2026-01-22 22:48:34.523 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:34 compute-0 nova_compute[182725]: 2026-01-22 22:48:34.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:34 compute-0 nova_compute[182725]: 2026-01-22 22:48:34.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:48:34 compute-0 nova_compute[182725]: 2026-01-22 22:48:34.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:48:35 compute-0 nova_compute[182725]: 2026-01-22 22:48:35.632 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:48:35 compute-0 nova_compute[182725]: 2026-01-22 22:48:35.633 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:48:35 compute-0 nova_compute[182725]: 2026-01-22 22:48:35.633 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:48:35 compute-0 nova_compute[182725]: 2026-01-22 22:48:35.634 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cb8ac1a6-ad25-4019-add5-64c347b769cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:48:35 compute-0 nova_compute[182725]: 2026-01-22 22:48:35.930 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:38 compute-0 nova_compute[182725]: 2026-01-22 22:48:38.194 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.546 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.926 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.946 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.946 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.947 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.947 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.947 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.948 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.987 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.988 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.988 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:39 compute-0 nova_compute[182725]: 2026-01-22 22:48:39.988 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.105 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:40 compute-0 podman[235466]: 2026-01-22 22:48:40.163917917 +0000 UTC m=+0.092959233 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.192 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.193 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.252 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.421 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.422 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5495MB free_disk=73.28826904296875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.422 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.422 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.647 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance cb8ac1a6-ad25-4019-add5-64c347b769cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.647 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.647 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.775 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:48:40 compute-0 nova_compute[182725]: 2026-01-22 22:48:40.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:43 compute-0 nova_compute[182725]: 2026-01-22 22:48:43.099 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:48:43 compute-0 nova_compute[182725]: 2026-01-22 22:48:43.190 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:48:43 compute-0 nova_compute[182725]: 2026-01-22 22:48:43.190 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:44 compute-0 nova_compute[182725]: 2026-01-22 22:48:44.131 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:44 compute-0 nova_compute[182725]: 2026-01-22 22:48:44.132 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:44 compute-0 nova_compute[182725]: 2026-01-22 22:48:44.132 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:48:44 compute-0 nova_compute[182725]: 2026-01-22 22:48:44.548 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:44 compute-0 nova_compute[182725]: 2026-01-22 22:48:44.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:45 compute-0 nova_compute[182725]: 2026-01-22 22:48:45.174 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:45 compute-0 nova_compute[182725]: 2026-01-22 22:48:45.935 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:47 compute-0 podman[235492]: 2026-01-22 22:48:47.133604179 +0000 UTC m=+0.074702240 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 22:48:47 compute-0 podman[235493]: 2026-01-22 22:48:47.134660535 +0000 UTC m=+0.072267699 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 22:48:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:47.341 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:66:79 2001:db8:0:1:f816:3eff:fe33:6679 2001:db8::f816:3eff:fe33:6679'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe33:6679/64 2001:db8::f816:3eff:fe33:6679/64', 'neutron:device_id': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30651656-9209-4f2c-a0e4-55fbbfbf46e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=58916e1d-6812-4cb1-a469-e8a2b6c851b7) old=Port_Binding(mac=['fa:16:3e:33:66:79 2001:db8::f816:3eff:fe33:6679'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe33:6679/64', 'neutron:device_id': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:48:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:47.344 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 58916e1d-6812-4cb1-a469-e8a2b6c851b7 in datapath 7b0b7be0-dc91-4e0d-bd73-07331822edfa updated
Jan 22 22:48:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:47.347 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b0b7be0-dc91-4e0d-bd73-07331822edfa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:48:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:48:47.349 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[abef4c07-ffef-4e3e-a512-3c2cc44e06ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:48:47 compute-0 ovn_controller[94850]: 2026-01-22T22:48:47Z|00662|binding|INFO|Releasing lport 941474a6-10cc-4642-b048-e5e47f4d8a09 from this chassis (sb_readonly=0)
Jan 22 22:48:47 compute-0 nova_compute[182725]: 2026-01-22 22:48:47.590 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:49 compute-0 nova_compute[182725]: 2026-01-22 22:48:49.550 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:50 compute-0 nova_compute[182725]: 2026-01-22 22:48:50.174 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:50 compute-0 nova_compute[182725]: 2026-01-22 22:48:50.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:54 compute-0 nova_compute[182725]: 2026-01-22 22:48:54.552 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:55 compute-0 nova_compute[182725]: 2026-01-22 22:48:55.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:55 compute-0 nova_compute[182725]: 2026-01-22 22:48:55.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:57 compute-0 podman[235538]: 2026-01-22 22:48:57.132079422 +0000 UTC m=+0.057820940 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:48:57 compute-0 podman[235539]: 2026-01-22 22:48:57.13481641 +0000 UTC m=+0.059692036 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:48:57 compute-0 podman[235540]: 2026-01-22 22:48:57.138588664 +0000 UTC m=+0.053498822 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.554 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.889 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.889 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.890 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.890 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.890 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.891 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.936 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.936 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Image id 48dd0ec8-2856-44d4-b286-44fdc64ba78d yields fingerprint 5c84b8e4375662442ba075bd9445186e2017954e _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.936 182729 INFO nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] image 48dd0ec8-2856-44d4-b286-44fdc64ba78d at (/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e): checking
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.937 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] image 48dd0ec8-2856-44d4-b286-44fdc64ba78d at (/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.939 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.939 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] cb8ac1a6-ad25-4019-add5-64c347b769cb is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.940 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] cb8ac1a6-ad25-4019-add5-64c347b769cb has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Jan 22 22:48:59 compute-0 nova_compute[182725]: 2026-01-22 22:48:59.940 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.018 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.020 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance cb8ac1a6-ad25-4019-add5-64c347b769cb is backed by 5c84b8e4375662442ba075bd9445186e2017954e _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.021 182729 WARNING nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.021 182729 INFO nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Active base files: /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.022 182729 INFO nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.023 182729 INFO nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2de6e157e1fcab398fb030cb85fc31760486f7cc
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.024 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.024 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.025 182729 DEBUG nova.virt.libvirt.imagecache [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 22:49:00 compute-0 nova_compute[182725]: 2026-01-22 22:49:00.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:04.392 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:49:04 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:04.393 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.393 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.556 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.615 182729 DEBUG nova.compute.manager [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.616 182729 DEBUG nova.compute.manager [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing instance network info cache due to event network-changed-a585bd6a-2858-42c6-a61d-e2f5ae701b22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.617 182729 DEBUG oslo_concurrency.lockutils [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.618 182729 DEBUG oslo_concurrency.lockutils [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:49:04 compute-0 nova_compute[182725]: 2026-01-22 22:49:04.618 182729 DEBUG nova.network.neutron [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Refreshing network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.047 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.048 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.048 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.049 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.050 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.065 182729 INFO nova.compute.manager [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Terminating instance
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.079 182729 DEBUG nova.compute.manager [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:49:05 compute-0 kernel: tapa585bd6a-28 (unregistering): left promiscuous mode
Jan 22 22:49:05 compute-0 NetworkManager[54954]: <info>  [1769122145.1146] device (tapa585bd6a-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:49:05 compute-0 ovn_controller[94850]: 2026-01-22T22:49:05Z|00663|binding|INFO|Releasing lport a585bd6a-2858-42c6-a61d-e2f5ae701b22 from this chassis (sb_readonly=0)
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.124 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 ovn_controller[94850]: 2026-01-22T22:49:05Z|00664|binding|INFO|Setting lport a585bd6a-2858-42c6-a61d-e2f5ae701b22 down in Southbound
Jan 22 22:49:05 compute-0 ovn_controller[94850]: 2026-01-22T22:49:05Z|00665|binding|INFO|Removing iface tapa585bd6a-28 ovn-installed in OVS
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.129 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.136 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:da:ab 10.100.0.8'], port_security=['fa:16:3e:67:da:ab 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cb8ac1a6-ad25-4019-add5-64c347b769cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f14033-82f9-4533-a194-36532baa893b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '039e09b7-4927-4c69-bb9d-1012bf4a1d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c5e7990-8af4-4ab4-b8e4-c75ffda3dd74, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=a585bd6a-2858-42c6-a61d-e2f5ae701b22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.138 104215 INFO neutron.agent.ovn.metadata.agent [-] Port a585bd6a-2858-42c6-a61d-e2f5ae701b22 in datapath f3f14033-82f9-4533-a194-36532baa893b unbound from our chassis
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.139 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3f14033-82f9-4533-a194-36532baa893b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.140 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2606f23f-ada8-469c-abca-65ae9573bc82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.141 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f3f14033-82f9-4533-a194-36532baa893b namespace which is not needed anymore
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 22 22:49:05 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000a1.scope: Consumed 14.709s CPU time.
Jan 22 22:49:05 compute-0 systemd-machined[154006]: Machine qemu-71-instance-000000a1 terminated.
Jan 22 22:49:05 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [NOTICE]   (235050) : haproxy version is 2.8.14-c23fe91
Jan 22 22:49:05 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [NOTICE]   (235050) : path to executable is /usr/sbin/haproxy
Jan 22 22:49:05 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [WARNING]  (235050) : Exiting Master process...
Jan 22 22:49:05 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [ALERT]    (235050) : Current worker (235052) exited with code 143 (Terminated)
Jan 22 22:49:05 compute-0 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[235046]: [WARNING]  (235050) : All workers exited. Exiting... (0)
Jan 22 22:49:05 compute-0 systemd[1]: libpod-beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629.scope: Deactivated successfully.
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 podman[235630]: 2026-01-22 22:49:05.309307885 +0000 UTC m=+0.055153073 container died beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.312 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629-userdata-shm.mount: Deactivated successfully.
Jan 22 22:49:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-cd4e0fa1b1e393c262bce6eebc08dc8880bfe58ba865f16b97ab7a09ba88541e-merged.mount: Deactivated successfully.
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.355 182729 INFO nova.virt.libvirt.driver [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Instance destroyed successfully.
Jan 22 22:49:05 compute-0 podman[235630]: 2026-01-22 22:49:05.356569641 +0000 UTC m=+0.102414799 container cleanup beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.357 182729 DEBUG nova.objects.instance [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'resources' on Instance uuid cb8ac1a6-ad25-4019-add5-64c347b769cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:49:05 compute-0 systemd[1]: libpod-conmon-beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629.scope: Deactivated successfully.
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.373 182729 DEBUG nova.virt.libvirt.vif [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:47:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-207363830',display_name='tempest-TestSnapshotPattern-server-207363830',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-207363830',id=161,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:47:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-5461ovew',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:48:18Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',uuid=cb8ac1a6-ad25-4019-add5-64c347b769cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.373 182729 DEBUG nova.network.os_vif_util [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.374 182729 DEBUG nova.network.os_vif_util [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.375 182729 DEBUG os_vif [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.377 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.377 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa585bd6a-28, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.379 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.383 182729 INFO os_vif [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:da:ab,bridge_name='br-int',has_traffic_filtering=True,id=a585bd6a-2858-42c6-a61d-e2f5ae701b22,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa585bd6a-28')
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.384 182729 INFO nova.virt.libvirt.driver [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Deleting instance files /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb_del
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.385 182729 INFO nova.virt.libvirt.driver [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Deletion of /var/lib/nova/instances/cb8ac1a6-ad25-4019-add5-64c347b769cb_del complete
Jan 22 22:49:05 compute-0 podman[235675]: 2026-01-22 22:49:05.432427439 +0000 UTC m=+0.044199211 container remove beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.438 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[673b243c-981d-41f8-9611-c16887107d56]: (4, ('Thu Jan 22 10:49:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b (beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629)\nbeedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629\nThu Jan 22 10:49:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b (beedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629)\nbeedf892855b0aa6001c3b81d0ff0cfacf0986db6bc361a8e01b684619f71629\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.440 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[01b0b38b-6e40-4ead-86f1-188fbeafb19a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.440 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f14033-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.442 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 kernel: tapf3f14033-80: left promiscuous mode
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.453 182729 INFO nova.compute.manager [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.454 182729 DEBUG oslo.service.loopingcall [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.455 182729 DEBUG nova.compute.manager [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.455 182729 DEBUG nova.network.neutron [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:49:05 compute-0 nova_compute[182725]: 2026-01-22 22:49:05.461 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.462 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c32ad202-f1cd-412c-9149-6921317baceb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.486 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[41f3a83d-5eb1-432d-960b-0e78435a03dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.488 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[975a68f4-1222-4c44-945c-c6ddb6dea8a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.505 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[649fe7a7-6f7d-4345-b44a-0985a019ee9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570681, 'reachable_time': 22665, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235690, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.507 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f3f14033-82f9-4533-a194-36532baa893b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:49:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:05.507 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a9498694-bd90-4534-9b3b-4511fdb15a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:05 compute-0 systemd[1]: run-netns-ovnmeta\x2df3f14033\x2d82f9\x2d4533\x2da194\x2d36532baa893b.mount: Deactivated successfully.
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.028 182729 DEBUG nova.network.neutron [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updated VIF entry in instance network info cache for port a585bd6a-2858-42c6-a61d-e2f5ae701b22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.029 182729 DEBUG nova.network.neutron [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [{"id": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "address": "fa:16:3e:67:da:ab", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa585bd6a-28", "ovs_interfaceid": "a585bd6a-2858-42c6-a61d-e2f5ae701b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.060 182729 DEBUG oslo_concurrency.lockutils [req-6d7890cd-2a20-4a17-b362-1ff674f329e7 req-f30648d3-311b-4656-9be8-6a2e93f00853 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-cb8ac1a6-ad25-4019-add5-64c347b769cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.106 182729 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-unplugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.106 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.107 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.107 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.107 182729 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] No waiting events found dispatching network-vif-unplugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.108 182729 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-unplugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.108 182729 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.108 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.109 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.109 182729 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.109 182729 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] No waiting events found dispatching network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.109 182729 WARNING nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received unexpected event network-vif-plugged-a585bd6a-2858-42c6-a61d-e2f5ae701b22 for instance with vm_state active and task_state deleting.
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.158 182729 DEBUG nova.network.neutron [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.187 182729 INFO nova.compute.manager [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Took 1.73 seconds to deallocate network for instance.
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.260 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.260 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.324 182729 DEBUG nova.compute.provider_tree [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.343 182729 DEBUG nova.scheduler.client.report [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.380 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:07 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:07.395 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.413 182729 INFO nova.scheduler.client.report [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Deleted allocations for instance cb8ac1a6-ad25-4019-add5-64c347b769cb
Jan 22 22:49:07 compute-0 nova_compute[182725]: 2026-01-22 22:49:07.494 182729 DEBUG oslo_concurrency.lockutils [None req-0ad5bc9c-37d8-4cf7-bb98-6bfd28645c9b abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "cb8ac1a6-ad25-4019-add5-64c347b769cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:08 compute-0 nova_compute[182725]: 2026-01-22 22:49:08.869 182729 DEBUG nova.compute.manager [req-8c0628df-3d28-42d3-828c-b3ad964f4909 req-ffb6e775-bb1a-486b-8a50-7567ea712be8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Received event network-vif-deleted-a585bd6a-2858-42c6-a61d-e2f5ae701b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:09 compute-0 nova_compute[182725]: 2026-01-22 22:49:09.558 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:10 compute-0 nova_compute[182725]: 2026-01-22 22:49:10.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:11 compute-0 podman[235691]: 2026-01-22 22:49:11.16992091 +0000 UTC m=+0.094081972 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 22:49:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:12.458 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:12.458 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:12.459 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:13 compute-0 nova_compute[182725]: 2026-01-22 22:49:13.732 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:13 compute-0 nova_compute[182725]: 2026-01-22 22:49:13.950 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:14 compute-0 nova_compute[182725]: 2026-01-22 22:49:14.581 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:15 compute-0 nova_compute[182725]: 2026-01-22 22:49:15.384 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:18 compute-0 podman[235715]: 2026-01-22 22:49:18.192956857 +0000 UTC m=+0.114898120 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Jan 22 22:49:18 compute-0 podman[235714]: 2026-01-22 22:49:18.202224667 +0000 UTC m=+0.136662361 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 22:49:19 compute-0 nova_compute[182725]: 2026-01-22 22:49:19.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:20 compute-0 nova_compute[182725]: 2026-01-22 22:49:20.355 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122145.3527107, cb8ac1a6-ad25-4019-add5-64c347b769cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:49:20 compute-0 nova_compute[182725]: 2026-01-22 22:49:20.357 182729 INFO nova.compute.manager [-] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] VM Stopped (Lifecycle Event)
Jan 22 22:49:20 compute-0 nova_compute[182725]: 2026-01-22 22:49:20.386 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:20 compute-0 nova_compute[182725]: 2026-01-22 22:49:20.400 182729 DEBUG nova.compute.manager [None req-97619529-995f-4773-afc6-f2e86ac8492a - - - - - -] [instance: cb8ac1a6-ad25-4019-add5-64c347b769cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:49:24 compute-0 nova_compute[182725]: 2026-01-22 22:49:24.585 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:25 compute-0 nova_compute[182725]: 2026-01-22 22:49:25.389 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:27 compute-0 nova_compute[182725]: 2026-01-22 22:49:27.020 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:28 compute-0 podman[235759]: 2026-01-22 22:49:28.114551854 +0000 UTC m=+0.051366529 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:49:28 compute-0 podman[235760]: 2026-01-22 22:49:28.118498702 +0000 UTC m=+0.051705007 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 22 22:49:28 compute-0 podman[235761]: 2026-01-22 22:49:28.145539225 +0000 UTC m=+0.074493474 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:49:29 compute-0 nova_compute[182725]: 2026-01-22 22:49:29.586 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:30 compute-0 nova_compute[182725]: 2026-01-22 22:49:30.391 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:30 compute-0 nova_compute[182725]: 2026-01-22 22:49:30.956 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:30 compute-0 nova_compute[182725]: 2026-01-22 22:49:30.956 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:30 compute-0 nova_compute[182725]: 2026-01-22 22:49:30.975 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.259 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.259 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.267 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.267 182729 INFO nova.compute.claims [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.494 182729 DEBUG nova.compute.provider_tree [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.507 182729 DEBUG nova.scheduler.client.report [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.527 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.527 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.577 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.578 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.596 182729 INFO nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.633 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.880 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.881 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.882 182729 INFO nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Creating image(s)
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.882 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.883 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.883 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.895 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.973 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.975 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.975 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:31 compute-0 nova_compute[182725]: 2026-01-22 22:49:31.986 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.043 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.044 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.078 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.080 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.080 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.136 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.138 182729 DEBUG nova.virt.disk.api [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.138 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.166 182729 DEBUG nova.policy [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.194 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.195 182729 DEBUG nova.virt.disk.api [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.195 182729 DEBUG nova.objects.instance [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid faed94a0-9d35-4cef-9a93-8676494aefb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.209 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.209 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Ensure instance console log exists: /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.210 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.210 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:32 compute-0 nova_compute[182725]: 2026-01-22 22:49:32.210 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:33 compute-0 nova_compute[182725]: 2026-01-22 22:49:33.521 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Successfully created port: f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:49:33 compute-0 nova_compute[182725]: 2026-01-22 22:49:33.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:33 compute-0 nova_compute[182725]: 2026-01-22 22:49:33.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:49:34 compute-0 nova_compute[182725]: 2026-01-22 22:49:34.065 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Successfully created port: c90439e7-3e16-4e66-bbc5-02906db93e08 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:49:34 compute-0 nova_compute[182725]: 2026-01-22 22:49:34.587 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.203 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Successfully updated port: f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.374 182729 DEBUG nova.compute.manager [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.375 182729 DEBUG nova.compute.manager [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing instance network info cache due to event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.375 182729 DEBUG oslo_concurrency.lockutils [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.376 182729 DEBUG oslo_concurrency.lockutils [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.376 182729 DEBUG nova.network.neutron [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing network info cache for port f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.439 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.671 182729 DEBUG nova.network.neutron [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.948 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.948 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.948 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.964 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 22:49:35 compute-0 nova_compute[182725]: 2026-01-22 22:49:35.965 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.906 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.907 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.907 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.907 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.964 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Successfully updated port: c90439e7-3e16-4e66-bbc5-02906db93e08 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.981 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:49:36 compute-0 nova_compute[182725]: 2026-01-22 22:49:36.990 182729 DEBUG nova.network.neutron [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.013 182729 DEBUG oslo_concurrency.lockutils [req-73661690-61ae-4dc4-a8f3-a6510634508b req-40707781-fdf6-4b3c-9341-5d97a754f8c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.014 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.014 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.095 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.096 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.31677627563477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.096 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.096 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.201 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance faed94a0-9d35-4cef-9a93-8676494aefb6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.201 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.201 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.248 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.268 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.290 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.291 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.362 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.710 182729 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-changed-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.711 182729 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing instance network info cache due to event network-changed-c90439e7-3e16-4e66-bbc5-02906db93e08. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:49:37 compute-0 nova_compute[182725]: 2026-01-22 22:49:37.711 182729 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:49:39 compute-0 nova_compute[182725]: 2026-01-22 22:49:39.290 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:39 compute-0 nova_compute[182725]: 2026-01-22 22:49:39.290 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:39 compute-0 nova_compute[182725]: 2026-01-22 22:49:39.588 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.224 182729 DEBUG nova.network.neutron [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.249 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.249 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance network_info: |[{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.250 182729 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.250 182729 DEBUG nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing network info cache for port c90439e7-3e16-4e66-bbc5-02906db93e08 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.257 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Start _get_guest_xml network_info=[{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.262 182729 WARNING nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.267 182729 DEBUG nova.virt.libvirt.host [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.268 182729 DEBUG nova.virt.libvirt.host [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.276 182729 DEBUG nova.virt.libvirt.host [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.277 182729 DEBUG nova.virt.libvirt.host [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.279 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.279 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.280 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.281 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.281 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.282 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.282 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.282 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.283 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.283 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.284 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.284 182729 DEBUG nova.virt.hardware [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.290 182729 DEBUG nova.virt.libvirt.vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:31Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.291 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.292 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.293 182729 DEBUG nova.virt.libvirt.vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:31Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.294 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.295 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.297 182729 DEBUG nova.objects.instance [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid faed94a0-9d35-4cef-9a93-8676494aefb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.312 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <uuid>faed94a0-9d35-4cef-9a93-8676494aefb6</uuid>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <name>instance-000000a6</name>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:name>tempest-TestGettingAddress-server-1483093445</nova:name>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:49:40</nova:creationTime>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:port uuid="f0a86929-bba0-4a5d-9a12-6ac73816e0b3">
Jan 22 22:49:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         <nova:port uuid="c90439e7-3e16-4e66-bbc5-02906db93e08">
Jan 22 22:49:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe97:1485" ipVersion="6"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe97:1485" ipVersion="6"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <system>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="serial">faed94a0-9d35-4cef-9a93-8676494aefb6</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="uuid">faed94a0-9d35-4cef-9a93-8676494aefb6</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </system>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <os>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </os>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <features>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </features>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.config"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:d6:c2:ee"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <target dev="tapf0a86929-bb"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:97:14:85"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <target dev="tapc90439e7-3e"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/console.log" append="off"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <video>
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </video>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:49:40 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:49:40 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:49:40 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:49:40 compute-0 nova_compute[182725]: </domain>
Jan 22 22:49:40 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.314 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Preparing to wait for external event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.315 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.315 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.315 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.316 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Preparing to wait for external event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.316 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.316 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.316 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.317 182729 DEBUG nova.virt.libvirt.vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:31Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.317 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.318 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.318 182729 DEBUG os_vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.319 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.319 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.319 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.323 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0a86929-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.323 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0a86929-bb, col_values=(('external_ids', {'iface-id': 'f0a86929-bba0-4a5d-9a12-6ac73816e0b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:c2:ee', 'vm-uuid': 'faed94a0-9d35-4cef-9a93-8676494aefb6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.325 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 NetworkManager[54954]: <info>  [1769122180.3264] manager: (tapf0a86929-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.328 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.334 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.336 182729 INFO os_vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb')
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.338 182729 DEBUG nova.virt.libvirt.vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:31Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.339 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.340 182729 DEBUG nova.network.os_vif_util [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.341 182729 DEBUG os_vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.342 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.342 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.343 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.346 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc90439e7-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.347 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc90439e7-3e, col_values=(('external_ids', {'iface-id': 'c90439e7-3e16-4e66-bbc5-02906db93e08', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:14:85', 'vm-uuid': 'faed94a0-9d35-4cef-9a93-8676494aefb6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.349 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 NetworkManager[54954]: <info>  [1769122180.3497] manager: (tapc90439e7-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.360 182729 INFO os_vif [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e')
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.432 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.433 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.433 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:d6:c2:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.433 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:97:14:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.434 182729 INFO nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Using config drive
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.958 182729 INFO nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Creating config drive at /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.config
Jan 22 22:49:40 compute-0 nova_compute[182725]: 2026-01-22 22:49:40.963 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4ghixkj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.100 182729 DEBUG oslo_concurrency.processutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4ghixkj" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.1711] manager: (tapf0a86929-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/306)
Jan 22 22:49:41 compute-0 kernel: tapf0a86929-bb: entered promiscuous mode
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.175 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00666|binding|INFO|Claiming lport f0a86929-bba0-4a5d-9a12-6ac73816e0b3 for this chassis.
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00667|binding|INFO|f0a86929-bba0-4a5d-9a12-6ac73816e0b3: Claiming fa:16:3e:d6:c2:ee 10.100.0.5
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.1841] manager: (tapc90439e7-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Jan 22 22:49:41 compute-0 kernel: tapc90439e7-3e: entered promiscuous mode
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.185 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.188 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.190 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.1909] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.1915] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00668|if_status|INFO|Not updating pb chassis for c90439e7-3e16-4e66-bbc5-02906db93e08 now as sb is readonly
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.195 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c2:ee 10.100.0.5'], port_security=['fa:16:3e:d6:c2:ee 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'faed94a0-9d35-4cef-9a93-8676494aefb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a73a41f0-812c-4e4a-8c47-e2095e6414a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43064b03-af4b-4784-84e9-32242d15fee5, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=f0a86929-bba0-4a5d-9a12-6ac73816e0b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.196 104215 INFO neutron.agent.ovn.metadata.agent [-] Port f0a86929-bba0-4a5d-9a12-6ac73816e0b3 in datapath 1e744b0e-1559-4ac7-9396-bcabe1d9688d bound to our chassis
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.197 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1e744b0e-1559-4ac7-9396-bcabe1d9688d
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.210 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2d54f0a8-186c-4c4c-aa33-5eed1d2b48dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.211 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1e744b0e-11 in ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.214 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1e744b0e-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.214 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[499128dc-1828-4943-8458-d8b7443c8dbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.216 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b3e938-2ad5-445c-a488-ec3847682bc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.232 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[1a16ec76-cc45-4854-96b8-99d4de75e5d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 systemd-udevd[235881]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:49:41 compute-0 systemd-machined[154006]: New machine qemu-72-instance-000000a6.
Jan 22 22:49:41 compute-0 systemd-udevd[235884]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:49:41 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-000000a6.
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.2650] device (tapf0a86929-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.2671] device (tapf0a86929-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.2715] device (tapc90439e7-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.2721] device (tapc90439e7-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.279 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[09aefe65-833e-430f-afd2-c9e1c6b98b82]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 podman[235863]: 2026-01-22 22:49:41.285260028 +0000 UTC m=+0.080933865 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.302 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.309 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[02098794-bbdc-47cb-a75f-158c98b55a4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00669|binding|INFO|Claiming lport c90439e7-3e16-4e66-bbc5-02906db93e08 for this chassis.
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00670|binding|INFO|c90439e7-3e16-4e66-bbc5-02906db93e08: Claiming fa:16:3e:97:14:85 2001:db8:0:1:f816:3eff:fe97:1485 2001:db8::f816:3eff:fe97:1485
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.3172] manager: (tap1e744b0e-10): new Veth device (/org/freedesktop/NetworkManager/Devices/310)
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.316 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aeafd069-2d82-499a-a174-90bf03d42cbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00671|binding|INFO|Setting lport f0a86929-bba0-4a5d-9a12-6ac73816e0b3 ovn-installed in OVS
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00672|binding|INFO|Setting lport f0a86929-bba0-4a5d-9a12-6ac73816e0b3 up in Southbound
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.325 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:14:85 2001:db8:0:1:f816:3eff:fe97:1485 2001:db8::f816:3eff:fe97:1485'], port_security=['fa:16:3e:97:14:85 2001:db8:0:1:f816:3eff:fe97:1485 2001:db8::f816:3eff:fe97:1485'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe97:1485/64 2001:db8::f816:3eff:fe97:1485/64', 'neutron:device_id': 'faed94a0-9d35-4cef-9a93-8676494aefb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a73a41f0-812c-4e4a-8c47-e2095e6414a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30651656-9209-4f2c-a0e4-55fbbfbf46e6, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c90439e7-3e16-4e66-bbc5-02906db93e08) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00673|binding|INFO|Setting lport c90439e7-3e16-4e66-bbc5-02906db93e08 ovn-installed in OVS
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.355 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[89adc301-2dcb-4e5e-9187-3207f76e3b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.358 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5a0192-1d9d-47b0-b071-6b82adf14a96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00674|binding|INFO|Setting lport c90439e7-3e16-4e66-bbc5-02906db93e08 up in Southbound
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.3784] device (tap1e744b0e-10): carrier: link connected
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.384 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d73ca847-9c45-4ed3-9b68-0fe43852e3c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.400 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5640bb16-7c61-400d-a667-e1e1a0c91645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e744b0e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:1d:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581898, 'reachable_time': 30840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235920, 'error': None, 'target': 'ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.419 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a403f3dc-55e7-41e0-848f-e0cb1edfd1aa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:1d8c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581898, 'tstamp': 581898}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235921, 'error': None, 'target': 'ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.437 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[74cee9fe-1199-4931-b462-c859648adee4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e744b0e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:1d:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581898, 'reachable_time': 30840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235922, 'error': None, 'target': 'ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.481 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[538dfe5e-2c29-4a94-be03-cf24b3af3f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.557 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[963c7fb0-9dbf-4221-96bc-1843c38a4c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.559 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e744b0e-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.560 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.560 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e744b0e-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:41 compute-0 kernel: tap1e744b0e-10: entered promiscuous mode
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.563 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 NetworkManager[54954]: <info>  [1769122181.5669] manager: (tap1e744b0e-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.571 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1e744b0e-10, col_values=(('external_ids', {'iface-id': '841b4933-4305-4690-bcb6-36e7d77197d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_controller[94850]: 2026-01-22T22:49:41Z|00675|binding|INFO|Releasing lport 841b4933-4305-4690-bcb6-36e7d77197d4 from this chassis (sb_readonly=0)
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.574 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.575 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1e744b0e-1559-4ac7-9396-bcabe1d9688d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1e744b0e-1559-4ac7-9396-bcabe1d9688d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.577 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5191f9f6-fded-4fa1-a97b-8eb31038cbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.578 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-1e744b0e-1559-4ac7-9396-bcabe1d9688d
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/1e744b0e-1559-4ac7-9396-bcabe1d9688d.pid.haproxy
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 1e744b0e-1559-4ac7-9396-bcabe1d9688d
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:49:41 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:41.580 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'env', 'PROCESS_TAG=haproxy-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1e744b0e-1559-4ac7-9396-bcabe1d9688d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.590 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.742 182729 DEBUG nova.compute.manager [req-213cb479-b537-433b-8036-c7955c515a0d req-59050a56-5e0e-45b4-9b39-64bc7f0bc482 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.742 182729 DEBUG oslo_concurrency.lockutils [req-213cb479-b537-433b-8036-c7955c515a0d req-59050a56-5e0e-45b4-9b39-64bc7f0bc482 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.743 182729 DEBUG oslo_concurrency.lockutils [req-213cb479-b537-433b-8036-c7955c515a0d req-59050a56-5e0e-45b4-9b39-64bc7f0bc482 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.743 182729 DEBUG oslo_concurrency.lockutils [req-213cb479-b537-433b-8036-c7955c515a0d req-59050a56-5e0e-45b4-9b39-64bc7f0bc482 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.743 182729 DEBUG nova.compute.manager [req-213cb479-b537-433b-8036-c7955c515a0d req-59050a56-5e0e-45b4-9b39-64bc7f0bc482 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Processing event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.922 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122181.9209454, faed94a0-9d35-4cef-9a93-8676494aefb6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.922 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] VM Started (Lifecycle Event)
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.932 182729 DEBUG nova.compute.manager [req-612f1a37-60f6-4948-bc86-7f291075b84e req-9a9fbb60-4108-41d6-8c92-5572c3608e1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.933 182729 DEBUG oslo_concurrency.lockutils [req-612f1a37-60f6-4948-bc86-7f291075b84e req-9a9fbb60-4108-41d6-8c92-5572c3608e1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.933 182729 DEBUG oslo_concurrency.lockutils [req-612f1a37-60f6-4948-bc86-7f291075b84e req-9a9fbb60-4108-41d6-8c92-5572c3608e1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.933 182729 DEBUG oslo_concurrency.lockutils [req-612f1a37-60f6-4948-bc86-7f291075b84e req-9a9fbb60-4108-41d6-8c92-5572c3608e1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.933 182729 DEBUG nova.compute.manager [req-612f1a37-60f6-4948-bc86-7f291075b84e req-9a9fbb60-4108-41d6-8c92-5572c3608e1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Processing event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.934 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.944 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.948 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.949 182729 INFO nova.virt.libvirt.driver [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance spawned successfully.
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.950 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.952 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.971 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.971 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122181.9243639, faed94a0-9d35-4cef-9a93-8676494aefb6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.972 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] VM Paused (Lifecycle Event)
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.975 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.976 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.976 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.977 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.977 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:41 compute-0 nova_compute[182725]: 2026-01-22 22:49:41.977 182729 DEBUG nova.virt.libvirt.driver [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.006 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.010 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122181.943312, faed94a0-9d35-4cef-9a93-8676494aefb6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.010 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] VM Resumed (Lifecycle Event)
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.040 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.045 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.054 182729 INFO nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Took 10.17 seconds to spawn the instance on the hypervisor.
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.055 182729 DEBUG nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:49:42 compute-0 podman[235961]: 2026-01-22 22:49:42.055595866 +0000 UTC m=+0.062298691 container create e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.067 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:49:42 compute-0 systemd[1]: Started libpod-conmon-e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88.scope.
Jan 22 22:49:42 compute-0 ovn_controller[94850]: 2026-01-22T22:49:42Z|00676|binding|INFO|Releasing lport 841b4933-4305-4690-bcb6-36e7d77197d4 from this chassis (sb_readonly=0)
Jan 22 22:49:42 compute-0 podman[235961]: 2026-01-22 22:49:42.023046626 +0000 UTC m=+0.029749541 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:49:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:49:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15efc2eea3cf02bd146df56b54715d437178c26e59888e7dc7e405a6a923dfb4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.168 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:42 compute-0 podman[235961]: 2026-01-22 22:49:42.172420292 +0000 UTC m=+0.179123207 container init e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:49:42 compute-0 podman[235961]: 2026-01-22 22:49:42.178646556 +0000 UTC m=+0.185349421 container start e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.203 182729 INFO nova.compute.manager [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Took 11.17 seconds to build instance.
Jan 22 22:49:42 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [NOTICE]   (235979) : New worker (235981) forked
Jan 22 22:49:42 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [NOTICE]   (235979) : Loading success.
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.239 182729 DEBUG oslo_concurrency.lockutils [None req-ad48744c-4fa9-4658-8c07-85571b59f757 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.251 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c90439e7-3e16-4e66-bbc5-02906db93e08 in datapath 7b0b7be0-dc91-4e0d-bd73-07331822edfa unbound from our chassis
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.253 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b0b7be0-dc91-4e0d-bd73-07331822edfa
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.265 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42ff39a9-cb1a-482e-8b1b-07285daaa325]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.266 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b0b7be0-d1 in ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.271 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b0b7be0-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.271 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f053603-f5ad-46cf-b004-11887745a6e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.272 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0725e245-1139-449b-99a7-fb934beb9a81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.283 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4b966a31-019c-4eab-9557-61d2f4fb9aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.306 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c639c65c-9aa2-4bb4-86ed-ae31b24026c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.334 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[afb8d6a5-0575-4fb4-bf88-3cf2d67d0d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 NetworkManager[54954]: <info>  [1769122182.3405] manager: (tap7b0b7be0-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/312)
Jan 22 22:49:42 compute-0 systemd-udevd[235910]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.339 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cfaeda09-999f-4e0c-80ea-b68266ad9621]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.378 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc185db-5672-46b4-ac0a-f001a4164e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.381 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f87aee86-087f-4b59-892c-77ee31db3a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 NetworkManager[54954]: <info>  [1769122182.4095] device (tap7b0b7be0-d0): carrier: link connected
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.417 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a6c76a-6a82-41f2-8984-965a26253e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.436 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20df662c-8be6-4446-b3cb-31a44899b430]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b0b7be0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:66:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582001, 'reachable_time': 30875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236000, 'error': None, 'target': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.457 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[db4d86d0-ac9e-4a31-8133-1a679034e850]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:6679'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582001, 'tstamp': 582001}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236001, 'error': None, 'target': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.482 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8702ac-cded-482b-96ca-0972387b961f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b0b7be0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:66:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582001, 'reachable_time': 30875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236002, 'error': None, 'target': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.532 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0a3ceb-3e3e-4fe4-b8c3-8732f3495406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.541 182729 DEBUG nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updated VIF entry in instance network info cache for port c90439e7-3e16-4e66-bbc5-02906db93e08. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.542 182729 DEBUG nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.556 182729 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.572 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a7619bc8-3576-4c66-9977-ddf12ce1cd9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.573 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b0b7be0-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.574 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.574 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b0b7be0-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:42 compute-0 kernel: tap7b0b7be0-d0: entered promiscuous mode
Jan 22 22:49:42 compute-0 NetworkManager[54954]: <info>  [1769122182.5772] manager: (tap7b0b7be0-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.579 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b0b7be0-d0, col_values=(('external_ids', {'iface-id': '58916e1d-6812-4cb1-a469-e8a2b6c851b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:49:42 compute-0 ovn_controller[94850]: 2026-01-22T22:49:42Z|00677|binding|INFO|Releasing lport 58916e1d-6812-4cb1-a469-e8a2b6c851b7 from this chassis (sb_readonly=0)
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.584 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.584 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b0b7be0-dc91-4e0d-bd73-07331822edfa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b0b7be0-dc91-4e0d-bd73-07331822edfa.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.587 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[dd98416b-7aa1-4772-881f-f0cd9b716f6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.587 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-7b0b7be0-dc91-4e0d-bd73-07331822edfa
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/7b0b7be0-dc91-4e0d-bd73-07331822edfa.pid.haproxy
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 7b0b7be0-dc91-4e0d-bd73-07331822edfa
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:49:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:49:42.588 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'env', 'PROCESS_TAG=haproxy-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b0b7be0-dc91-4e0d-bd73-07331822edfa.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:49:42 compute-0 nova_compute[182725]: 2026-01-22 22:49:42.600 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:42 compute-0 podman[236033]: 2026-01-22 22:49:42.945046236 +0000 UTC m=+0.051693117 container create d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 22:49:42 compute-0 systemd[1]: Started libpod-conmon-d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3.scope.
Jan 22 22:49:43 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:49:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e26effa7f7b862215acdbd88b61c092c840ed2761ddaa45e4c0ca6b7f8b59b13/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:49:43 compute-0 podman[236033]: 2026-01-22 22:49:42.915963913 +0000 UTC m=+0.022610844 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:49:43 compute-0 podman[236033]: 2026-01-22 22:49:43.383151197 +0000 UTC m=+0.489798098 container init d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:49:43 compute-0 podman[236033]: 2026-01-22 22:49:43.388529591 +0000 UTC m=+0.495176472 container start d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:49:43 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [NOTICE]   (236053) : New worker (236055) forked
Jan 22 22:49:43 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [NOTICE]   (236053) : Loading success.
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.953 182729 DEBUG nova.compute.manager [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.954 182729 DEBUG oslo_concurrency.lockutils [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.954 182729 DEBUG oslo_concurrency.lockutils [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.954 182729 DEBUG oslo_concurrency.lockutils [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.954 182729 DEBUG nova.compute.manager [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:49:43 compute-0 nova_compute[182725]: 2026-01-22 22:49:43.955 182729 WARNING nova.compute.manager [req-7a0be131-fdba-4869-9fbc-01baabe20cf4 req-33c0c93f-bf73-4a46-8759-78c393990c63 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received unexpected event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 for instance with vm_state active and task_state None.
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.031 182729 DEBUG nova.compute.manager [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.031 182729 DEBUG oslo_concurrency.lockutils [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.032 182729 DEBUG oslo_concurrency.lockutils [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.032 182729 DEBUG oslo_concurrency.lockutils [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.032 182729 DEBUG nova.compute.manager [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.033 182729 WARNING nova.compute.manager [req-3b16ed60-791e-4c15-945d-ea3e046f327c req-52cf58e8-f92b-4b20-bc1c-d0276334d1e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received unexpected event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 for instance with vm_state active and task_state None.
Jan 22 22:49:44 compute-0 nova_compute[182725]: 2026-01-22 22:49:44.591 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:45 compute-0 nova_compute[182725]: 2026-01-22 22:49:45.349 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:45 compute-0 nova_compute[182725]: 2026-01-22 22:49:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:45 compute-0 nova_compute[182725]: 2026-01-22 22:49:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:48 compute-0 nova_compute[182725]: 2026-01-22 22:49:48.935 182729 DEBUG nova.compute.manager [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:49:48 compute-0 nova_compute[182725]: 2026-01-22 22:49:48.935 182729 DEBUG nova.compute.manager [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing instance network info cache due to event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:49:48 compute-0 nova_compute[182725]: 2026-01-22 22:49:48.935 182729 DEBUG oslo_concurrency.lockutils [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:49:48 compute-0 nova_compute[182725]: 2026-01-22 22:49:48.935 182729 DEBUG oslo_concurrency.lockutils [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:49:48 compute-0 nova_compute[182725]: 2026-01-22 22:49:48.936 182729 DEBUG nova.network.neutron [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing network info cache for port f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:49:49 compute-0 podman[236065]: 2026-01-22 22:49:49.132566705 +0000 UTC m=+0.063177603 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64)
Jan 22 22:49:49 compute-0 podman[236064]: 2026-01-22 22:49:49.161528166 +0000 UTC m=+0.091470327 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:49:49 compute-0 nova_compute[182725]: 2026-01-22 22:49:49.593 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:50 compute-0 nova_compute[182725]: 2026-01-22 22:49:50.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:50 compute-0 nova_compute[182725]: 2026-01-22 22:49:50.739 182729 DEBUG nova.network.neutron [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updated VIF entry in instance network info cache for port f0a86929-bba0-4a5d-9a12-6ac73816e0b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:49:50 compute-0 nova_compute[182725]: 2026-01-22 22:49:50.740 182729 DEBUG nova.network.neutron [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:49:50 compute-0 nova_compute[182725]: 2026-01-22 22:49:50.759 182729 DEBUG oslo_concurrency.lockutils [req-05597b9b-8cb8-4d1b-996c-27be52c55b08 req-4c83d8dd-f69d-4a64-834d-fd24b904a59e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:49:54 compute-0 nova_compute[182725]: 2026-01-22 22:49:54.595 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:55 compute-0 ovn_controller[94850]: 2026-01-22T22:49:55Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:c2:ee 10.100.0.5
Jan 22 22:49:55 compute-0 ovn_controller[94850]: 2026-01-22T22:49:55Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:c2:ee 10.100.0.5
Jan 22 22:49:55 compute-0 nova_compute[182725]: 2026-01-22 22:49:55.196 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:55 compute-0 nova_compute[182725]: 2026-01-22 22:49:55.354 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:56 compute-0 nova_compute[182725]: 2026-01-22 22:49:56.893 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:49:59 compute-0 nova_compute[182725]: 2026-01-22 22:49:59.106 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:49:59 compute-0 podman[236127]: 2026-01-22 22:49:59.162198313 +0000 UTC m=+0.077381396 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:49:59 compute-0 podman[236128]: 2026-01-22 22:49:59.162705166 +0000 UTC m=+0.089698393 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 22:49:59 compute-0 podman[236129]: 2026-01-22 22:49:59.177877713 +0000 UTC m=+0.084962855 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:49:59 compute-0 nova_compute[182725]: 2026-01-22 22:49:59.597 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:00 compute-0 nova_compute[182725]: 2026-01-22 22:50:00.357 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:04 compute-0 nova_compute[182725]: 2026-01-22 22:50:04.600 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:05 compute-0 nova_compute[182725]: 2026-01-22 22:50:05.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:06 compute-0 nova_compute[182725]: 2026-01-22 22:50:06.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:06 compute-0 nova_compute[182725]: 2026-01-22 22:50:06.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:50:06 compute-0 nova_compute[182725]: 2026-01-22 22:50:06.909 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.609 182729 DEBUG nova.compute.manager [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.610 182729 DEBUG nova.compute.manager [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing instance network info cache due to event network-changed-f0a86929-bba0-4a5d-9a12-6ac73816e0b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.611 182729 DEBUG oslo_concurrency.lockutils [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.611 182729 DEBUG oslo_concurrency.lockutils [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.612 182729 DEBUG nova.network.neutron [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Refreshing network info cache for port f0a86929-bba0-4a5d-9a12-6ac73816e0b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.666 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.667 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.668 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.668 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.669 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.683 182729 INFO nova.compute.manager [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Terminating instance
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.698 182729 DEBUG nova.compute.manager [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:50:08 compute-0 kernel: tapf0a86929-bb (unregistering): left promiscuous mode
Jan 22 22:50:08 compute-0 NetworkManager[54954]: <info>  [1769122208.7334] device (tapf0a86929-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00678|binding|INFO|Releasing lport f0a86929-bba0-4a5d-9a12-6ac73816e0b3 from this chassis (sb_readonly=0)
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00679|binding|INFO|Setting lport f0a86929-bba0-4a5d-9a12-6ac73816e0b3 down in Southbound
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.744 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00680|binding|INFO|Removing iface tapf0a86929-bb ovn-installed in OVS
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.749 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 kernel: tapc90439e7-3e (unregistering): left promiscuous mode
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.761 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c2:ee 10.100.0.5'], port_security=['fa:16:3e:d6:c2:ee 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'faed94a0-9d35-4cef-9a93-8676494aefb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a73a41f0-812c-4e4a-8c47-e2095e6414a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43064b03-af4b-4784-84e9-32242d15fee5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=f0a86929-bba0-4a5d-9a12-6ac73816e0b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.763 104215 INFO neutron.agent.ovn.metadata.agent [-] Port f0a86929-bba0-4a5d-9a12-6ac73816e0b3 in datapath 1e744b0e-1559-4ac7-9396-bcabe1d9688d unbound from our chassis
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.765 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e744b0e-1559-4ac7-9396-bcabe1d9688d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.766 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4970bf-cc05-4149-b6d2-2aef0af56a36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.767 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d namespace which is not needed anymore
Jan 22 22:50:08 compute-0 NetworkManager[54954]: <info>  [1769122208.7796] device (tapc90439e7-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.783 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00681|binding|INFO|Releasing lport c90439e7-3e16-4e66-bbc5-02906db93e08 from this chassis (sb_readonly=1)
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00682|binding|INFO|Removing iface tapc90439e7-3e ovn-installed in OVS
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00683|if_status|INFO|Dropped 2 log messages in last 360 seconds (most recently, 360 seconds ago) due to excessive rate
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00684|if_status|INFO|Not setting lport c90439e7-3e16-4e66-bbc5-02906db93e08 down as sb is readonly
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.796 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.797 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:50:08 compute-0 ovn_controller[94850]: 2026-01-22T22:50:08Z|00685|binding|INFO|Setting lport c90439e7-3e16-4e66-bbc5-02906db93e08 down in Southbound
Jan 22 22:50:08 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:08.810 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:14:85 2001:db8:0:1:f816:3eff:fe97:1485 2001:db8::f816:3eff:fe97:1485'], port_security=['fa:16:3e:97:14:85 2001:db8:0:1:f816:3eff:fe97:1485 2001:db8::f816:3eff:fe97:1485'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe97:1485/64 2001:db8::f816:3eff:fe97:1485/64', 'neutron:device_id': 'faed94a0-9d35-4cef-9a93-8676494aefb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a73a41f0-812c-4e4a-8c47-e2095e6414a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30651656-9209-4f2c-a0e4-55fbbfbf46e6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=c90439e7-3e16-4e66-bbc5-02906db93e08) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.827 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:08 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 22 22:50:08 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a6.scope: Consumed 13.699s CPU time.
Jan 22 22:50:08 compute-0 systemd-machined[154006]: Machine qemu-72-instance-000000a6 terminated.
Jan 22 22:50:08 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [NOTICE]   (235979) : haproxy version is 2.8.14-c23fe91
Jan 22 22:50:08 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [NOTICE]   (235979) : path to executable is /usr/sbin/haproxy
Jan 22 22:50:08 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [WARNING]  (235979) : Exiting Master process...
Jan 22 22:50:08 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [ALERT]    (235979) : Current worker (235981) exited with code 143 (Terminated)
Jan 22 22:50:08 compute-0 neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d[235975]: [WARNING]  (235979) : All workers exited. Exiting... (0)
Jan 22 22:50:08 compute-0 systemd[1]: libpod-e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88.scope: Deactivated successfully.
Jan 22 22:50:08 compute-0 podman[236218]: 2026-01-22 22:50:08.910178472 +0000 UTC m=+0.045248307 container died e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:50:08 compute-0 NetworkManager[54954]: <info>  [1769122208.9211] manager: (tapf0a86929-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 22 22:50:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88-userdata-shm.mount: Deactivated successfully.
Jan 22 22:50:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-15efc2eea3cf02bd146df56b54715d437178c26e59888e7dc7e405a6a923dfb4-merged.mount: Deactivated successfully.
Jan 22 22:50:08 compute-0 podman[236218]: 2026-01-22 22:50:08.957870558 +0000 UTC m=+0.092940353 container cleanup e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:50:08 compute-0 systemd[1]: libpod-conmon-e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88.scope: Deactivated successfully.
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.982 182729 INFO nova.virt.libvirt.driver [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Instance destroyed successfully.
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.983 182729 DEBUG nova.objects.instance [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid faed94a0-9d35-4cef-9a93-8676494aefb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.997 182729 DEBUG nova.virt.libvirt.vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:49:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:49:42Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.997 182729 DEBUG nova.network.os_vif_util [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.998 182729 DEBUG nova.network.os_vif_util [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:50:08 compute-0 nova_compute[182725]: 2026-01-22 22:50:08.998 182729 DEBUG os_vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.000 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.000 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0a86929-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.002 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.004 182729 DEBUG nova.compute.manager [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-unplugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.005 182729 DEBUG oslo_concurrency.lockutils [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.005 182729 DEBUG oslo_concurrency.lockutils [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.005 182729 DEBUG oslo_concurrency.lockutils [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.006 182729 DEBUG nova.compute.manager [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-unplugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.006 182729 DEBUG nova.compute.manager [req-eaf35d32-7299-4b9e-9ab4-6f51ee264bac req-4ad34d5c-f899-4af6-9374-bbc4b4c6a741 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-unplugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.006 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.010 182729 INFO os_vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:c2:ee,bridge_name='br-int',has_traffic_filtering=True,id=f0a86929-bba0-4a5d-9a12-6ac73816e0b3,network=Network(1e744b0e-1559-4ac7-9396-bcabe1d9688d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0a86929-bb')
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.011 182729 DEBUG nova.virt.libvirt.vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:49:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1483093445',display_name='tempest-TestGettingAddress-server-1483093445',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1483093445',id=166,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBVBrWhrWV2+OK0B0KsdlZt/7BxoMVzT16QtwS4PWoBh6Eqvfi8VtswQVJGTO+qQyjO+HTEKK/1qaqZ9dH7Hxk9McYjzl5QMixkUqAUPMLu8ir8hdly+vRL9+JEZ9ZwGOg==',key_name='tempest-TestGettingAddress-1359973904',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:49:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-z3qtw9st',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:49:42Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=faed94a0-9d35-4cef-9a93-8676494aefb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.011 182729 DEBUG nova.network.os_vif_util [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.012 182729 DEBUG nova.network.os_vif_util [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.012 182729 DEBUG os_vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.013 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.013 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc90439e7-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.014 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.015 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.017 182729 INFO os_vif [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:14:85,bridge_name='br-int',has_traffic_filtering=True,id=c90439e7-3e16-4e66-bbc5-02906db93e08,network=Network(7b0b7be0-dc91-4e0d-bd73-07331822edfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc90439e7-3e')
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.018 182729 INFO nova.virt.libvirt.driver [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Deleting instance files /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6_del
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.018 182729 INFO nova.virt.libvirt.driver [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Deletion of /var/lib/nova/instances/faed94a0-9d35-4cef-9a93-8676494aefb6_del complete
Jan 22 22:50:09 compute-0 podman[236274]: 2026-01-22 22:50:09.031362637 +0000 UTC m=+0.044195501 container remove e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.036 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0b57bd88-af48-4bc4-9ffe-be74e142ff5a]: (4, ('Thu Jan 22 10:50:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d (e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88)\ne6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88\nThu Jan 22 10:50:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d (e6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88)\ne6b725d43250f7f952d0f10037c20266e403b14131c6ef87301306bd8d297c88\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.038 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e58e2c2b-c2cc-403f-90a6-5b5bff38c3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.038 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e744b0e-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.039 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 kernel: tap1e744b0e-10: left promiscuous mode
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.050 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.054 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfcfa26-75b1-497c-87bc-4a29c3d0ea1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.076 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e77a5642-f283-48f4-86ad-a79184549f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.078 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[af155949-4a1f-474b-9c22-4c2fe025e8ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.082 182729 INFO nova.compute.manager [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.082 182729 DEBUG oslo.service.loopingcall [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.083 182729 DEBUG nova.compute.manager [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.083 182729 DEBUG nova.network.neutron [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.103 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[462a5bef-d90a-40c7-912b-18ce45e05214]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581890, 'reachable_time': 21446, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236293, 'error': None, 'target': 'ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.106 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1e744b0e-1559-4ac7-9396-bcabe1d9688d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.106 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c21517-0002-4382-b5c1-86d31cc8f56b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.106 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.107 104215 INFO neutron.agent.ovn.metadata.agent [-] Port c90439e7-3e16-4e66-bbc5-02906db93e08 in datapath 7b0b7be0-dc91-4e0d-bd73-07331822edfa unbound from our chassis
Jan 22 22:50:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d1e744b0e\x2d1559\x2d4ac7\x2d9396\x2dbcabe1d9688d.mount: Deactivated successfully.
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.108 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b0b7be0-dc91-4e0d-bd73-07331822edfa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.109 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b6672138-61bd-4a11-ab67-7ebd8049ae0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.109 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa namespace which is not needed anymore
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:50:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:50:09 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [NOTICE]   (236053) : haproxy version is 2.8.14-c23fe91
Jan 22 22:50:09 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [NOTICE]   (236053) : path to executable is /usr/sbin/haproxy
Jan 22 22:50:09 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [WARNING]  (236053) : Exiting Master process...
Jan 22 22:50:09 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [ALERT]    (236053) : Current worker (236055) exited with code 143 (Terminated)
Jan 22 22:50:09 compute-0 neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa[236049]: [WARNING]  (236053) : All workers exited. Exiting... (0)
Jan 22 22:50:09 compute-0 systemd[1]: libpod-d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3.scope: Deactivated successfully.
Jan 22 22:50:09 compute-0 podman[236311]: 2026-01-22 22:50:09.33048443 +0000 UTC m=+0.117623648 container died d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:50:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3-userdata-shm.mount: Deactivated successfully.
Jan 22 22:50:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-e26effa7f7b862215acdbd88b61c092c840ed2761ddaa45e4c0ca6b7f8b59b13-merged.mount: Deactivated successfully.
Jan 22 22:50:09 compute-0 podman[236311]: 2026-01-22 22:50:09.37429955 +0000 UTC m=+0.161438788 container cleanup d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:50:09 compute-0 systemd[1]: libpod-conmon-d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3.scope: Deactivated successfully.
Jan 22 22:50:09 compute-0 podman[236342]: 2026-01-22 22:50:09.44140225 +0000 UTC m=+0.043221076 container remove d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.449 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ca092697-c82b-423a-a560-d4a49796b674]: (4, ('Thu Jan 22 10:50:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa (d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3)\nd70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3\nThu Jan 22 10:50:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa (d70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3)\nd70d0f9367df1d942c85b21af00856053bd812b1bddfd543a675ded0aed9a2f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.450 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3a0287-6088-4e33-93ec-021a5943ba8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.451 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b0b7be0-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.453 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 kernel: tap7b0b7be0-d0: left promiscuous mode
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.479 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[db282bc2-151a-4e7f-8744-1a31650e8edc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.485 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.497 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[46a4e0cd-2bac-47c0-b3e6-c1c55f8963a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.498 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2ba6f358-6146-4d91-953b-be5deac1fe08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.522 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9240d855-a1c0-49c7-8d1b-db6ed93c22b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581993, 'reachable_time': 23495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236357, 'error': None, 'target': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.524 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:50:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:09.525 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0ea70a-1c98-4398-9b97-61ff151dbdfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:50:09 compute-0 nova_compute[182725]: 2026-01-22 22:50:09.603 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b0b7be0\x2ddc91\x2d4e0d\x2dbd73\x2d07331822edfa.mount: Deactivated successfully.
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.394 182729 DEBUG nova.compute.manager [req-87c60ce1-4b87-4b0e-adb4-631da6b27a41 req-b7949179-1da9-46f2-9345-2b6675f6b484 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-deleted-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.395 182729 INFO nova.compute.manager [req-87c60ce1-4b87-4b0e-adb4-631da6b27a41 req-b7949179-1da9-46f2-9345-2b6675f6b484 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Neutron deleted interface c90439e7-3e16-4e66-bbc5-02906db93e08; detaching it from the instance and deleting it from the info cache
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.395 182729 DEBUG nova.network.neutron [req-87c60ce1-4b87-4b0e-adb4-631da6b27a41 req-b7949179-1da9-46f2-9345-2b6675f6b484 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.408 182729 DEBUG nova.network.neutron [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updated VIF entry in instance network info cache for port f0a86929-bba0-4a5d-9a12-6ac73816e0b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.409 182729 DEBUG nova.network.neutron [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [{"id": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "address": "fa:16:3e:d6:c2:ee", "network": {"id": "1e744b0e-1559-4ac7-9396-bcabe1d9688d", "bridge": "br-int", "label": "tempest-network-smoke--1435433183", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0a86929-bb", "ovs_interfaceid": "f0a86929-bba0-4a5d-9a12-6ac73816e0b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c90439e7-3e16-4e66-bbc5-02906db93e08", "address": "fa:16:3e:97:14:85", "network": {"id": "7b0b7be0-dc91-4e0d-bd73-07331822edfa", "bridge": "br-int", "label": "tempest-network-smoke--1174140322", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe97:1485", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc90439e7-3e", "ovs_interfaceid": "c90439e7-3e16-4e66-bbc5-02906db93e08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.421 182729 DEBUG nova.compute.manager [req-87c60ce1-4b87-4b0e-adb4-631da6b27a41 req-b7949179-1da9-46f2-9345-2b6675f6b484 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Detach interface failed, port_id=c90439e7-3e16-4e66-bbc5-02906db93e08, reason: Instance faed94a0-9d35-4cef-9a93-8676494aefb6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.443 182729 DEBUG oslo_concurrency.lockutils [req-1521272a-95aa-439d-b49a-587451c99434 req-2b388c3c-753a-4c21-a0d2-71c4c1082d25 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-faed94a0-9d35-4cef-9a93-8676494aefb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.702 182729 DEBUG nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-unplugged-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.702 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.703 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.704 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.704 182729 DEBUG nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-unplugged-c90439e7-3e16-4e66-bbc5-02906db93e08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.704 182729 DEBUG nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-unplugged-c90439e7-3e16-4e66-bbc5-02906db93e08 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.705 182729 DEBUG nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.705 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.706 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.706 182729 DEBUG oslo_concurrency.lockutils [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.707 182729 DEBUG nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:50:10 compute-0 nova_compute[182725]: 2026-01-22 22:50:10.707 182729 WARNING nova.compute.manager [req-bb708eb3-5a4a-41f4-9c16-bf0efdc1cb3b req-19567bcd-771a-4251-bfa8-f18f5f485bd0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received unexpected event network-vif-plugged-c90439e7-3e16-4e66-bbc5-02906db93e08 for instance with vm_state active and task_state deleting.
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.006 182729 DEBUG nova.network.neutron [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.029 182729 INFO nova.compute.manager [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Took 1.95 seconds to deallocate network for instance.
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.143 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.144 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.164 182729 DEBUG nova.compute.manager [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.164 182729 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.165 182729 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.165 182729 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.165 182729 DEBUG nova.compute.manager [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] No waiting events found dispatching network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.165 182729 WARNING nova.compute.manager [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received unexpected event network-vif-plugged-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 for instance with vm_state deleted and task_state None.
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.199 182729 DEBUG nova.compute.provider_tree [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.213 182729 DEBUG nova.scheduler.client.report [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.240 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.263 182729 INFO nova.scheduler.client.report [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance faed94a0-9d35-4cef-9a93-8676494aefb6
Jan 22 22:50:11 compute-0 nova_compute[182725]: 2026-01-22 22:50:11.329 182729 DEBUG oslo_concurrency.lockutils [None req-e139cd20-0231-4d7f-b246-1213d93300cf 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "faed94a0-9d35-4cef-9a93-8676494aefb6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:12 compute-0 podman[236358]: 2026-01-22 22:50:12.146908988 +0000 UTC m=+0.076270240 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 22:50:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:12.459 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:12.460 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:12.460 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:12 compute-0 nova_compute[182725]: 2026-01-22 22:50:12.522 182729 DEBUG nova.compute.manager [req-c1d3c6a5-cd6e-476f-b252-80ffc4b53c05 req-43c09516-b328-4140-b508-c0fb219fd398 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Received event network-vif-deleted-f0a86929-bba0-4a5d-9a12-6ac73816e0b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:50:14 compute-0 nova_compute[182725]: 2026-01-22 22:50:14.015 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:14 compute-0 nova_compute[182725]: 2026-01-22 22:50:14.604 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:18 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:50:18.107 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:50:19 compute-0 nova_compute[182725]: 2026-01-22 22:50:19.017 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:19 compute-0 nova_compute[182725]: 2026-01-22 22:50:19.607 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:20 compute-0 podman[236382]: 2026-01-22 22:50:20.178831448 +0000 UTC m=+0.095386975 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 22:50:20 compute-0 podman[236381]: 2026-01-22 22:50:20.215141012 +0000 UTC m=+0.145656197 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:50:23 compute-0 nova_compute[182725]: 2026-01-22 22:50:23.981 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122208.9801784, faed94a0-9d35-4cef-9a93-8676494aefb6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:50:23 compute-0 nova_compute[182725]: 2026-01-22 22:50:23.982 182729 INFO nova.compute.manager [-] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] VM Stopped (Lifecycle Event)
Jan 22 22:50:24 compute-0 nova_compute[182725]: 2026-01-22 22:50:24.010 182729 DEBUG nova.compute.manager [None req-682ef33d-b86b-40fe-a59a-bf1c1b7b2ff3 - - - - - -] [instance: faed94a0-9d35-4cef-9a93-8676494aefb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:50:24 compute-0 nova_compute[182725]: 2026-01-22 22:50:24.020 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:24 compute-0 nova_compute[182725]: 2026-01-22 22:50:24.608 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:26 compute-0 nova_compute[182725]: 2026-01-22 22:50:26.122 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:26 compute-0 nova_compute[182725]: 2026-01-22 22:50:26.284 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:26 compute-0 nova_compute[182725]: 2026-01-22 22:50:26.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:29 compute-0 nova_compute[182725]: 2026-01-22 22:50:29.021 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:29 compute-0 nova_compute[182725]: 2026-01-22 22:50:29.610 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:30 compute-0 podman[236430]: 2026-01-22 22:50:30.133250304 +0000 UTC m=+0.058670841 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:50:30 compute-0 podman[236431]: 2026-01-22 22:50:30.144694209 +0000 UTC m=+0.064659030 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:50:30 compute-0 podman[236429]: 2026-01-22 22:50:30.174684935 +0000 UTC m=+0.102583963 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:50:34 compute-0 nova_compute[182725]: 2026-01-22 22:50:34.023 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:34 compute-0 nova_compute[182725]: 2026-01-22 22:50:34.612 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:35 compute-0 nova_compute[182725]: 2026-01-22 22:50:35.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:35 compute-0 nova_compute[182725]: 2026-01-22 22:50:35.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:50:35 compute-0 nova_compute[182725]: 2026-01-22 22:50:35.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:50:35 compute-0 nova_compute[182725]: 2026-01-22 22:50:35.903 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:50:37 compute-0 nova_compute[182725]: 2026-01-22 22:50:37.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:38 compute-0 nova_compute[182725]: 2026-01-22 22:50:38.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:38 compute-0 nova_compute[182725]: 2026-01-22 22:50:38.910 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:38 compute-0 nova_compute[182725]: 2026-01-22 22:50:38.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:38 compute-0 nova_compute[182725]: 2026-01-22 22:50:38.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:38 compute-0 nova_compute[182725]: 2026-01-22 22:50:38.911 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.026 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.101 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.103 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5673MB free_disk=73.31695175170898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.104 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.105 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.185 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.186 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.212 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.226 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.246 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.247 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:50:39 compute-0 nova_compute[182725]: 2026-01-22 22:50:39.614 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:40 compute-0 nova_compute[182725]: 2026-01-22 22:50:40.248 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:40 compute-0 nova_compute[182725]: 2026-01-22 22:50:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:42 compute-0 nova_compute[182725]: 2026-01-22 22:50:42.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:43 compute-0 podman[236492]: 2026-01-22 22:50:43.55356039 +0000 UTC m=+0.072503335 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:50:44 compute-0 nova_compute[182725]: 2026-01-22 22:50:44.028 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:44 compute-0 nova_compute[182725]: 2026-01-22 22:50:44.615 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:44 compute-0 nova_compute[182725]: 2026-01-22 22:50:44.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:44 compute-0 nova_compute[182725]: 2026-01-22 22:50:44.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:50:45 compute-0 nova_compute[182725]: 2026-01-22 22:50:45.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:50:49 compute-0 nova_compute[182725]: 2026-01-22 22:50:49.029 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:49 compute-0 nova_compute[182725]: 2026-01-22 22:50:49.618 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:51 compute-0 podman[236514]: 2026-01-22 22:50:51.174281999 +0000 UTC m=+0.092906052 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 22:50:51 compute-0 podman[236513]: 2026-01-22 22:50:51.209517476 +0000 UTC m=+0.134484687 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:50:54 compute-0 nova_compute[182725]: 2026-01-22 22:50:54.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:54 compute-0 nova_compute[182725]: 2026-01-22 22:50:54.621 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:59 compute-0 nova_compute[182725]: 2026-01-22 22:50:59.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:50:59 compute-0 nova_compute[182725]: 2026-01-22 22:50:59.623 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:01 compute-0 podman[236565]: 2026-01-22 22:51:01.142629932 +0000 UTC m=+0.063050620 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:51:01 compute-0 podman[236563]: 2026-01-22 22:51:01.149774899 +0000 UTC m=+0.080244817 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:51:01 compute-0 podman[236564]: 2026-01-22 22:51:01.16666474 +0000 UTC m=+0.083712164 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:51:03 compute-0 nova_compute[182725]: 2026-01-22 22:51:03.825 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:04 compute-0 nova_compute[182725]: 2026-01-22 22:51:04.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:04 compute-0 nova_compute[182725]: 2026-01-22 22:51:04.625 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:09 compute-0 nova_compute[182725]: 2026-01-22 22:51:09.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:09 compute-0 nova_compute[182725]: 2026-01-22 22:51:09.630 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:12.461 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:12.461 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:12.462 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:14 compute-0 nova_compute[182725]: 2026-01-22 22:51:14.038 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:14 compute-0 podman[236628]: 2026-01-22 22:51:14.192765219 +0000 UTC m=+0.106720387 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:51:14 compute-0 nova_compute[182725]: 2026-01-22 22:51:14.641 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:17.085 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:51:17 compute-0 nova_compute[182725]: 2026-01-22 22:51:17.086 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:17 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:17.087 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.042 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:19 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:19.089 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.493 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.494 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.513 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.644 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.664 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.665 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.674 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.675 182729 INFO nova.compute.claims [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.793 182729 DEBUG nova.compute.provider_tree [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.809 182729 DEBUG nova.scheduler.client.report [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.832 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.833 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.924 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.925 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.961 182729 INFO nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:51:19 compute-0 nova_compute[182725]: 2026-01-22 22:51:19.987 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.139 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.143 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.144 182729 INFO nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Creating image(s)
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.145 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.145 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.146 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.166 182729 DEBUG nova.policy [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.170 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.265 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.267 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.268 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.294 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.359 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.361 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.403 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.405 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.405 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.463 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.464 182729 DEBUG nova.virt.disk.api [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.464 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:20 compute-0 ovn_controller[94850]: 2026-01-22T22:51:20Z|00686|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.523 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.524 182729 DEBUG nova.virt.disk.api [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.524 182729 DEBUG nova.objects.instance [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid 22018abd-78e4-4ae1-9dc3-b6575ccec3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.538 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.539 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Ensure instance console log exists: /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.539 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.540 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:20 compute-0 nova_compute[182725]: 2026-01-22 22:51:20.540 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:21 compute-0 nova_compute[182725]: 2026-01-22 22:51:21.844 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Successfully created port: 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:51:22 compute-0 podman[236664]: 2026-01-22 22:51:22.1772887 +0000 UTC m=+0.094663537 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 22 22:51:22 compute-0 podman[236663]: 2026-01-22 22:51:22.215240814 +0000 UTC m=+0.140648990 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 22:51:22 compute-0 nova_compute[182725]: 2026-01-22 22:51:22.326 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Successfully created port: ec90cdd6-97f8-4516-9951-b92319878017 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.045 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.170 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Successfully updated port: 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.295 182729 DEBUG nova.compute.manager [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.295 182729 DEBUG nova.compute.manager [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing instance network info cache due to event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.296 182729 DEBUG oslo_concurrency.lockutils [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.296 182729 DEBUG oslo_concurrency.lockutils [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.297 182729 DEBUG nova.network.neutron [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing network info cache for port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.471 182729 DEBUG nova.network.neutron [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.688 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.828 182729 DEBUG nova.network.neutron [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.842 182729 DEBUG oslo_concurrency.lockutils [req-5174daa9-5542-4ed5-850a-bbd03e2e903a req-4f9998b2-54da-4f61-83f4-879694338e92 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.882 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Successfully updated port: ec90cdd6-97f8-4516-9951-b92319878017 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.905 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.906 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:24 compute-0 nova_compute[182725]: 2026-01-22 22:51:24.906 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:51:25 compute-0 nova_compute[182725]: 2026-01-22 22:51:25.110 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.406 182729 DEBUG nova.compute.manager [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-changed-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.407 182729 DEBUG nova.compute.manager [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing instance network info cache due to event network-changed-ec90cdd6-97f8-4516-9951-b92319878017. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.407 182729 DEBUG oslo_concurrency.lockutils [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.853 182729 DEBUG nova.network.neutron [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.882 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.883 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance network_info: |[{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.883 182729 DEBUG oslo_concurrency.lockutils [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.884 182729 DEBUG nova.network.neutron [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing network info cache for port ec90cdd6-97f8-4516-9951-b92319878017 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.892 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Start _get_guest_xml network_info=[{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.899 182729 WARNING nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.905 182729 DEBUG nova.virt.libvirt.host [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.906 182729 DEBUG nova.virt.libvirt.host [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.918 182729 DEBUG nova.virt.libvirt.host [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.919 182729 DEBUG nova.virt.libvirt.host [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.921 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.921 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.922 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.922 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.923 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.923 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.924 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.924 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.925 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.925 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.926 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.926 182729 DEBUG nova.virt.hardware [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.932 182729 DEBUG nova.virt.libvirt.vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:20Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.933 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.934 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.936 182729 DEBUG nova.virt.libvirt.vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:20Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.937 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.938 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.940 182729 DEBUG nova.objects.instance [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 22018abd-78e4-4ae1-9dc3-b6575ccec3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.960 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <uuid>22018abd-78e4-4ae1-9dc3-b6575ccec3ab</uuid>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <name>instance-000000aa</name>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:name>tempest-TestGettingAddress-server-1981038866</nova:name>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:51:26</nova:creationTime>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:port uuid="0749b6ee-d9e9-4fcf-897c-8183f2cc8329">
Jan 22 22:51:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         <nova:port uuid="ec90cdd6-97f8-4516-9951-b92319878017">
Jan 22 22:51:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe0c:b063" ipVersion="6"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <system>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="serial">22018abd-78e4-4ae1-9dc3-b6575ccec3ab</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="uuid">22018abd-78e4-4ae1-9dc3-b6575ccec3ab</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </system>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <os>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </os>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <features>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </features>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.config"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:ea:07:a8"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <target dev="tap0749b6ee-d9"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:0c:b0:63"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <target dev="tapec90cdd6-97"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/console.log" append="off"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <video>
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </video>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:51:26 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:51:26 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:51:26 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:51:26 compute-0 nova_compute[182725]: </domain>
Jan 22 22:51:26 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.962 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Preparing to wait for external event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.962 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.962 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.963 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.963 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Preparing to wait for external event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.963 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.963 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.964 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.965 182729 DEBUG nova.virt.libvirt.vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:20Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.965 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.966 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.966 182729 DEBUG os_vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.968 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.968 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.969 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.974 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.975 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0749b6ee-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.975 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0749b6ee-d9, col_values=(('external_ids', {'iface-id': '0749b6ee-d9e9-4fcf-897c-8183f2cc8329', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:07:a8', 'vm-uuid': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.977 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 NetworkManager[54954]: <info>  [1769122286.9795] manager: (tap0749b6ee-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.979 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.985 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.986 182729 INFO os_vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9')
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.987 182729 DEBUG nova.virt.libvirt.vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:20Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.988 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.988 182729 DEBUG nova.network.os_vif_util [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.989 182729 DEBUG os_vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.989 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.990 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.990 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.993 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.993 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec90cdd6-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.994 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec90cdd6-97, col_values=(('external_ids', {'iface-id': 'ec90cdd6-97f8-4516-9951-b92319878017', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:b0:63', 'vm-uuid': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.995 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:26 compute-0 NetworkManager[54954]: <info>  [1769122286.9964] manager: (tapec90cdd6-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 22 22:51:26 compute-0 nova_compute[182725]: 2026-01-22 22:51:26.997 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.004 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.005 182729 INFO os_vif [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97')
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.066 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.067 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.067 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:ea:07:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.067 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:0c:b0:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.068 182729 INFO nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Using config drive
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.873 182729 INFO nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Creating config drive at /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.config
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.883 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpssqjddi3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:27 compute-0 nova_compute[182725]: 2026-01-22 22:51:27.916 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.034 182729 DEBUG oslo_concurrency.processutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpssqjddi3" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:28 compute-0 kernel: tap0749b6ee-d9: entered promiscuous mode
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.1410] manager: (tap0749b6ee-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00687|binding|INFO|Claiming lport 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 for this chassis.
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00688|binding|INFO|0749b6ee-d9e9-4fcf-897c-8183f2cc8329: Claiming fa:16:3e:ea:07:a8 10.100.0.5
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.174 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.1901] manager: (tapec90cdd6-97): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 22 22:51:28 compute-0 systemd-udevd[236735]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:51:28 compute-0 kernel: tapec90cdd6-97: entered promiscuous mode
Jan 22 22:51:28 compute-0 systemd-udevd[236737]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.194 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.204 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00689|if_status|INFO|Not updating pb chassis for ec90cdd6-97f8-4516-9951-b92319878017 now as sb is readonly
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2062] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2075] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2116] device (tap0749b6ee-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2142] device (tap0749b6ee-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2159] device (tapec90cdd6-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.2177] device (tapec90cdd6-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.224 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:07:a8 10.100.0.5'], port_security=['fa:16:3e:ea:07:a8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ebfbc13b-ac30-4931-ad50-aee9310d5139', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=183b4411-9dc3-4d8c-b92d-18467d08b35a, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=0749b6ee-d9e9-4fcf-897c-8183f2cc8329) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.226 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 in datapath 1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a bound to our chassis
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.228 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.256 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3792e040-2e40-40c3-9d32-9014135ad2f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.257 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a6ea39e-e1 in ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:51:28 compute-0 systemd-machined[154006]: New machine qemu-73-instance-000000aa.
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.261 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a6ea39e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.261 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3723d1ab-83f1-470f-bc2a-f2dbedf88d84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.263 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60ec5542-b6be-42e9-829e-63189c53ee11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.278 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[019b105e-ae0d-4520-ae81-9e551f4e8718]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.310 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[748781fd-4443-4b2b-b0b3-8a67b805287c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.341 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-000000aa.
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.350 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00690|binding|INFO|Claiming lport ec90cdd6-97f8-4516-9951-b92319878017 for this chassis.
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00691|binding|INFO|ec90cdd6-97f8-4516-9951-b92319878017: Claiming fa:16:3e:0c:b0:63 2001:db8::f816:3eff:fe0c:b063
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.361 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b27d38ff-8320-4eba-ae0d-46c9fa983b8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00692|binding|INFO|Setting lport 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 ovn-installed in OVS
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.365 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00693|binding|INFO|Setting lport 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 up in Southbound
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.375 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:b0:63 2001:db8::f816:3eff:fe0c:b063'], port_security=['fa:16:3e:0c:b0:63 2001:db8::f816:3eff:fe0c:b063'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe0c:b063/64', 'neutron:device_id': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-204db677-5698-4972-9966-0fa4e404c5b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ebfbc13b-ac30-4931-ad50-aee9310d5139', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0025dfcd-c081-4212-bd17-0b469a3d3922, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=ec90cdd6-97f8-4516-9951-b92319878017) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00694|binding|INFO|Setting lport ec90cdd6-97f8-4516-9951-b92319878017 ovn-installed in OVS
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00695|binding|INFO|Setting lport ec90cdd6-97f8-4516-9951-b92319878017 up in Southbound
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.3837] manager: (tap1a6ea39e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.384 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.385 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0dabb5d3-f6be-4015-b645-fa8adeef8bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.442 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[02cc192c-a909-4795-8ece-0ea5d378d5dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.447 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[a025f08d-6332-44aa-839c-29c1bc84f8db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.4846] device (tap1a6ea39e-e0): carrier: link connected
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.493 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[64233251-e1ef-43eb-9221-85c9fa83b220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.524 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[67d92987-e90d-4d36-a50e-9a6ad9c18157]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a6ea39e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:d1:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592608, 'reachable_time': 24443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236774, 'error': None, 'target': 'ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.554 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1423a16a-b101-47a9-aa7f-7cf5b7bc69d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:d141'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592608, 'tstamp': 592608}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236775, 'error': None, 'target': 'ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.583 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9781541c-e86b-4ce7-bef5-4257163a03b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a6ea39e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:d1:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592608, 'reachable_time': 24443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236776, 'error': None, 'target': 'ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.584 182729 DEBUG nova.network.neutron [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updated VIF entry in instance network info cache for port ec90cdd6-97f8-4516-9951-b92319878017. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.585 182729 DEBUG nova.network.neutron [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.621 182729 DEBUG nova.compute.manager [req-6ea4bcea-146d-4d5b-a3b9-4f58d1a4ea01 req-b991e08a-ccd1-44c4-aceb-7019d37e348b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.622 182729 DEBUG oslo_concurrency.lockutils [req-6ea4bcea-146d-4d5b-a3b9-4f58d1a4ea01 req-b991e08a-ccd1-44c4-aceb-7019d37e348b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.623 182729 DEBUG oslo_concurrency.lockutils [req-6ea4bcea-146d-4d5b-a3b9-4f58d1a4ea01 req-b991e08a-ccd1-44c4-aceb-7019d37e348b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.623 182729 DEBUG oslo_concurrency.lockutils [req-6ea4bcea-146d-4d5b-a3b9-4f58d1a4ea01 req-b991e08a-ccd1-44c4-aceb-7019d37e348b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.624 182729 DEBUG nova.compute.manager [req-6ea4bcea-146d-4d5b-a3b9-4f58d1a4ea01 req-b991e08a-ccd1-44c4-aceb-7019d37e348b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Processing event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.626 182729 DEBUG oslo_concurrency.lockutils [req-59b3f8e6-45d0-4d13-a234-c4b61d66030c req-e206c4b3-2576-4308-8ce4-1be5a46ce141 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.638 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[803fed63-c8c6-42ab-a8bd-bf27840d9ab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.749 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0e8c6a-f405-469b-91b8-72cd16c39df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.752 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a6ea39e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.753 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.753 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a6ea39e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.756 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 NetworkManager[54954]: <info>  [1769122288.7578] manager: (tap1a6ea39e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 22 22:51:28 compute-0 kernel: tap1a6ea39e-e0: entered promiscuous mode
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.762 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.764 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a6ea39e-e0, col_values=(('external_ids', {'iface-id': 'd2fe8580-1635-43ef-b939-fe62738e8c36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.765 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_controller[94850]: 2026-01-22T22:51:28Z|00696|binding|INFO|Releasing lport d2fe8580-1635-43ef-b939-fe62738e8c36 from this chassis (sb_readonly=0)
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.791 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.792 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42aef259-ed99-488b-b893-74580e61900c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.793 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a.pid.haproxy
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:51:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:28.794 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'env', 'PROCESS_TAG=haproxy-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.806 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122288.8057058, 22018abd-78e4-4ae1-9dc3-b6575ccec3ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.807 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] VM Started (Lifecycle Event)
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.824 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.830 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122288.8067021, 22018abd-78e4-4ae1-9dc3-b6575ccec3ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.831 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] VM Paused (Lifecycle Event)
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.844 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.846 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:51:28 compute-0 nova_compute[182725]: 2026-01-22 22:51:28.860 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:51:29 compute-0 podman[236816]: 2026-01-22 22:51:29.254180647 +0000 UTC m=+0.085475338 container create 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:51:29 compute-0 systemd[1]: Started libpod-conmon-9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc.scope.
Jan 22 22:51:29 compute-0 podman[236816]: 2026-01-22 22:51:29.213000182 +0000 UTC m=+0.044294923 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:51:29 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:51:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ef0ecf30e87a8acfeca262d4544371f29b81671236035251560f9018687f630/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:51:29 compute-0 podman[236816]: 2026-01-22 22:51:29.36965132 +0000 UTC m=+0.200946011 container init 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:51:29 compute-0 podman[236816]: 2026-01-22 22:51:29.381080844 +0000 UTC m=+0.212375545 container start 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:51:29 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [NOTICE]   (236835) : New worker (236837) forked
Jan 22 22:51:29 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [NOTICE]   (236835) : Loading success.
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.464 104215 INFO neutron.agent.ovn.metadata.agent [-] Port ec90cdd6-97f8-4516-9951-b92319878017 in datapath 204db677-5698-4972-9966-0fa4e404c5b7 unbound from our chassis
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.468 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 204db677-5698-4972-9966-0fa4e404c5b7
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.486 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[96540928-a6ee-4a71-9986-a29f91709729]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.487 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap204db677-51 in ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.489 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap204db677-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.489 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[af873a3d-10ab-4f87-bcd0-aec90f1b1e0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.491 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[46754226-3ae6-4376-b825-ca04efd21396]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.505 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[116585ad-48d0-4f78-810f-72c49a88ee7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.536 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[43f6c134-e783-478a-b018-790adab6ba79]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.580 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[05026e22-9a54-4fcd-9a2c-a670dd66550e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 NetworkManager[54954]: <info>  [1769122289.5923] manager: (tap204db677-50): new Veth device (/org/freedesktop/NetworkManager/Devices/323)
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.591 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[94ef48eb-a169-45be-ad14-ce0f62b2943c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 systemd-udevd[236764]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.643 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[12394893-caed-40b5-bf2e-826a198dc95e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.648 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[052e1c19-f760-4683-9b44-2bbed97b4ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 NetworkManager[54954]: <info>  [1769122289.6846] device (tap204db677-50): carrier: link connected
Jan 22 22:51:29 compute-0 nova_compute[182725]: 2026-01-22 22:51:29.693 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.694 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[71d158b6-d7f0-4a44-98ec-fdc95bec2663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.723 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb76f67-b7f2-4464-9a04-54168f28a525]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap204db677-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:b7:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592728, 'reachable_time': 33557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236857, 'error': None, 'target': 'ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.751 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6836eb-2a67-46c9-b7cb-b0ba3a500614]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:b78c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592728, 'tstamp': 592728}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236858, 'error': None, 'target': 'ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.782 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2c232452-3eec-47ae-9b3e-8525f46d3994]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap204db677-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:b7:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592728, 'reachable_time': 33557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236859, 'error': None, 'target': 'ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.840 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[60eb7e24-9c54-4f80-959c-ce66442ec565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.899 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9739215a-6afc-4b93-943b-3e4da05730cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.901 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap204db677-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.902 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.903 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap204db677-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:29 compute-0 NetworkManager[54954]: <info>  [1769122289.9067] manager: (tap204db677-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 22 22:51:29 compute-0 kernel: tap204db677-50: entered promiscuous mode
Jan 22 22:51:29 compute-0 nova_compute[182725]: 2026-01-22 22:51:29.905 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:29 compute-0 nova_compute[182725]: 2026-01-22 22:51:29.910 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.912 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap204db677-50, col_values=(('external_ids', {'iface-id': '6860b24b-069a-43cd-a7e3-0e590e00c9d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:29 compute-0 nova_compute[182725]: 2026-01-22 22:51:29.913 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:29 compute-0 ovn_controller[94850]: 2026-01-22T22:51:29Z|00697|binding|INFO|Releasing lport 6860b24b-069a-43cd-a7e3-0e590e00c9d1 from this chassis (sb_readonly=0)
Jan 22 22:51:29 compute-0 nova_compute[182725]: 2026-01-22 22:51:29.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.939 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/204db677-5698-4972-9966-0fa4e404c5b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/204db677-5698-4972-9966-0fa4e404c5b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.941 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[78e1597e-43ff-4511-969b-ce3d6808bb9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.942 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-204db677-5698-4972-9966-0fa4e404c5b7
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/204db677-5698-4972-9966-0fa4e404c5b7.pid.haproxy
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 204db677-5698-4972-9966-0fa4e404c5b7
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:51:29 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:29.943 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7', 'env', 'PROCESS_TAG=haproxy-204db677-5698-4972-9966-0fa4e404c5b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/204db677-5698-4972-9966-0fa4e404c5b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:51:30 compute-0 podman[236891]: 2026-01-22 22:51:30.392027568 +0000 UTC m=+0.080118764 container create 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:51:30 compute-0 systemd[1]: Started libpod-conmon-303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551.scope.
Jan 22 22:51:30 compute-0 podman[236891]: 2026-01-22 22:51:30.354236078 +0000 UTC m=+0.042327324 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:51:30 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:51:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bd29bfe82dd93b807c99efc769401a5d00db8d628e8353ec9210e959b59721f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:51:30 compute-0 podman[236891]: 2026-01-22 22:51:30.496014836 +0000 UTC m=+0.184106052 container init 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:51:30 compute-0 podman[236891]: 2026-01-22 22:51:30.502766713 +0000 UTC m=+0.190857900 container start 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:51:30 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [NOTICE]   (236910) : New worker (236912) forked
Jan 22 22:51:30 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [NOTICE]   (236910) : Loading success.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.726 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.727 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.728 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.728 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.729 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No event matching network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 in dict_keys([('network-vif-plugged', 'ec90cdd6-97f8-4516-9951-b92319878017')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.729 182729 WARNING nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received unexpected event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 for instance with vm_state building and task_state spawning.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.729 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.730 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.730 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.730 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.731 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Processing event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.731 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.732 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.732 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.732 182729 DEBUG oslo_concurrency.lockutils [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.732 182729 DEBUG nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No waiting events found dispatching network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.733 182729 WARNING nova.compute.manager [req-f652ada8-a472-4c59-b5ae-3bb4409779d5 req-7ddd1b9a-d018-4732-9c7a-0fc07afca44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received unexpected event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 for instance with vm_state building and task_state spawning.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.734 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.739 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122290.7388313, 22018abd-78e4-4ae1-9dc3-b6575ccec3ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.739 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] VM Resumed (Lifecycle Event)
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.742 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.747 182729 INFO nova.virt.libvirt.driver [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance spawned successfully.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.748 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.778 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.787 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.792 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.793 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.794 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.795 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.796 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.796 182729 DEBUG nova.virt.libvirt.driver [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.828 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.879 182729 INFO nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Took 10.74 seconds to spawn the instance on the hypervisor.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.879 182729 DEBUG nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.959 182729 INFO nova.compute.manager [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Took 11.37 seconds to build instance.
Jan 22 22:51:30 compute-0 nova_compute[182725]: 2026-01-22 22:51:30.977 182729 DEBUG oslo_concurrency.lockutils [None req-6dfd9ced-7ccd-4d1c-af6d-b5526c1a417e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:32 compute-0 nova_compute[182725]: 2026-01-22 22:51:32.029 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:32 compute-0 podman[236922]: 2026-01-22 22:51:32.140774091 +0000 UTC m=+0.067802268 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 22:51:32 compute-0 podman[236921]: 2026-01-22 22:51:32.158282097 +0000 UTC m=+0.076899525 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 22:51:32 compute-0 podman[236923]: 2026-01-22 22:51:32.189592836 +0000 UTC m=+0.103412354 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:51:33 compute-0 ovn_controller[94850]: 2026-01-22T22:51:33Z|00698|binding|INFO|Releasing lport 6860b24b-069a-43cd-a7e3-0e590e00c9d1 from this chassis (sb_readonly=0)
Jan 22 22:51:33 compute-0 ovn_controller[94850]: 2026-01-22T22:51:33Z|00699|binding|INFO|Releasing lport d2fe8580-1635-43ef-b939-fe62738e8c36 from this chassis (sb_readonly=0)
Jan 22 22:51:33 compute-0 nova_compute[182725]: 2026-01-22 22:51:33.210 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.252 182729 DEBUG nova.compute.manager [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.253 182729 DEBUG nova.compute.manager [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing instance network info cache due to event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.254 182729 DEBUG oslo_concurrency.lockutils [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.254 182729 DEBUG oslo_concurrency.lockutils [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.255 182729 DEBUG nova.network.neutron [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing network info cache for port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:51:34 compute-0 nova_compute[182725]: 2026-01-22 22:51:34.698 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:36 compute-0 nova_compute[182725]: 2026-01-22 22:51:36.121 182729 DEBUG nova.network.neutron [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updated VIF entry in instance network info cache for port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:51:36 compute-0 nova_compute[182725]: 2026-01-22 22:51:36.122 182729 DEBUG nova.network.neutron [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:36 compute-0 nova_compute[182725]: 2026-01-22 22:51:36.144 182729 DEBUG oslo_concurrency.lockutils [req-47879075-8a48-4987-8934-701814fad1c1 req-84de2141-b676-4dee-be14-dc4bfbd1d623 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:36 compute-0 nova_compute[182725]: 2026-01-22 22:51:36.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:37 compute-0 nova_compute[182725]: 2026-01-22 22:51:37.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:37 compute-0 nova_compute[182725]: 2026-01-22 22:51:37.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:37 compute-0 nova_compute[182725]: 2026-01-22 22:51:37.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:51:37 compute-0 nova_compute[182725]: 2026-01-22 22:51:37.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:51:38 compute-0 nova_compute[182725]: 2026-01-22 22:51:38.133 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:38 compute-0 nova_compute[182725]: 2026-01-22 22:51:38.134 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:38 compute-0 nova_compute[182725]: 2026-01-22 22:51:38.135 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:51:38 compute-0 nova_compute[182725]: 2026-01-22 22:51:38.135 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 22018abd-78e4-4ae1-9dc3-b6575ccec3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:51:39 compute-0 nova_compute[182725]: 2026-01-22 22:51:39.728 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.439 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.540 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.557 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.558 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.559 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.560 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.585 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.586 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.586 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.587 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.661 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.733 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.734 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.790 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.936 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.937 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5439MB free_disk=73.31610107421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.937 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:40 compute-0 nova_compute[182725]: 2026-01-22 22:51:40.938 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.010 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 22018abd-78e4-4ae1-9dc3-b6575ccec3ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.011 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.011 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.032 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.049 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.050 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.066 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.085 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.132 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.147 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.171 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.172 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.501 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:41 compute-0 nova_compute[182725]: 2026-01-22 22:51:41.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:42 compute-0 nova_compute[182725]: 2026-01-22 22:51:42.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:42 compute-0 nova_compute[182725]: 2026-01-22 22:51:42.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:43 compute-0 ovn_controller[94850]: 2026-01-22T22:51:43Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:07:a8 10.100.0.5
Jan 22 22:51:43 compute-0 ovn_controller[94850]: 2026-01-22T22:51:43Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:07:a8 10.100.0.5
Jan 22 22:51:44 compute-0 nova_compute[182725]: 2026-01-22 22:51:44.728 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:45 compute-0 podman[237009]: 2026-01-22 22:51:45.124665136 +0000 UTC m=+0.060690411 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:51:45 compute-0 nova_compute[182725]: 2026-01-22 22:51:45.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:46 compute-0 nova_compute[182725]: 2026-01-22 22:51:46.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:51:46 compute-0 nova_compute[182725]: 2026-01-22 22:51:46.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:51:47 compute-0 nova_compute[182725]: 2026-01-22 22:51:47.075 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:49 compute-0 nova_compute[182725]: 2026-01-22 22:51:49.731 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.076 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.928 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.928 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.929 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.929 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.929 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.945 182729 INFO nova.compute.manager [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Terminating instance
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.956 182729 DEBUG nova.compute.manager [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:51:52 compute-0 kernel: tap0749b6ee-d9 (unregistering): left promiscuous mode
Jan 22 22:51:52 compute-0 NetworkManager[54954]: <info>  [1769122312.9819] device (tap0749b6ee-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:51:52 compute-0 ovn_controller[94850]: 2026-01-22T22:51:52Z|00700|binding|INFO|Releasing lport 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 from this chassis (sb_readonly=0)
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.992 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:52 compute-0 ovn_controller[94850]: 2026-01-22T22:51:52Z|00701|binding|INFO|Setting lport 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 down in Southbound
Jan 22 22:51:52 compute-0 ovn_controller[94850]: 2026-01-22T22:51:52Z|00702|binding|INFO|Removing iface tap0749b6ee-d9 ovn-installed in OVS
Jan 22 22:51:52 compute-0 nova_compute[182725]: 2026-01-22 22:51:52.995 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.005 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:07:a8 10.100.0.5'], port_security=['fa:16:3e:ea:07:a8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ebfbc13b-ac30-4931-ad50-aee9310d5139', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=183b4411-9dc3-4d8c-b92d-18467d08b35a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=0749b6ee-d9e9-4fcf-897c-8183f2cc8329) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.008 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 in datapath 1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a unbound from our chassis
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.012 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.013 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ac858190-7df9-41b8-9b1b-687a2ee9a481]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.014 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a namespace which is not needed anymore
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.023 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 kernel: tapec90cdd6-97 (unregistering): left promiscuous mode
Jan 22 22:51:53 compute-0 NetworkManager[54954]: <info>  [1769122313.0311] device (tapec90cdd6-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.054 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.058 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 ovn_controller[94850]: 2026-01-22T22:51:53Z|00703|binding|INFO|Releasing lport ec90cdd6-97f8-4516-9951-b92319878017 from this chassis (sb_readonly=0)
Jan 22 22:51:53 compute-0 ovn_controller[94850]: 2026-01-22T22:51:53Z|00704|binding|INFO|Setting lport ec90cdd6-97f8-4516-9951-b92319878017 down in Southbound
Jan 22 22:51:53 compute-0 ovn_controller[94850]: 2026-01-22T22:51:53Z|00705|binding|INFO|Removing iface tapec90cdd6-97 ovn-installed in OVS
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.079 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:b0:63 2001:db8::f816:3eff:fe0c:b063'], port_security=['fa:16:3e:0c:b0:63 2001:db8::f816:3eff:fe0c:b063'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe0c:b063/64', 'neutron:device_id': '22018abd-78e4-4ae1-9dc3-b6575ccec3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-204db677-5698-4972-9966-0fa4e404c5b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ebfbc13b-ac30-4931-ad50-aee9310d5139', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0025dfcd-c081-4212-bd17-0b469a3d3922, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=ec90cdd6-97f8-4516-9951-b92319878017) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:51:53 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 22 22:51:53 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000aa.scope: Consumed 13.541s CPU time.
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.090 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 systemd-machined[154006]: Machine qemu-73-instance-000000aa terminated.
Jan 22 22:51:53 compute-0 podman[237032]: 2026-01-22 22:51:53.110009228 +0000 UTC m=+0.096384499 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 22 22:51:53 compute-0 podman[237031]: 2026-01-22 22:51:53.161862218 +0000 UTC m=+0.150638719 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [NOTICE]   (236835) : haproxy version is 2.8.14-c23fe91
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [NOTICE]   (236835) : path to executable is /usr/sbin/haproxy
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [WARNING]  (236835) : Exiting Master process...
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [ALERT]    (236835) : Current worker (236837) exited with code 143 (Terminated)
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a[236831]: [WARNING]  (236835) : All workers exited. Exiting... (0)
Jan 22 22:51:53 compute-0 systemd[1]: libpod-9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc.scope: Deactivated successfully.
Jan 22 22:51:53 compute-0 NetworkManager[54954]: <info>  [1769122313.1980] manager: (tapec90cdd6-97): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Jan 22 22:51:53 compute-0 podman[237103]: 2026-01-22 22:51:53.199288849 +0000 UTC m=+0.052561699 container died 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 22:51:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ef0ecf30e87a8acfeca262d4544371f29b81671236035251560f9018687f630-merged.mount: Deactivated successfully.
Jan 22 22:51:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc-userdata-shm.mount: Deactivated successfully.
Jan 22 22:51:53 compute-0 podman[237103]: 2026-01-22 22:51:53.244921355 +0000 UTC m=+0.098194235 container cleanup 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.256 182729 INFO nova.virt.libvirt.driver [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Instance destroyed successfully.
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.257 182729 DEBUG nova.objects.instance [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid 22018abd-78e4-4ae1-9dc3-b6575ccec3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:51:53 compute-0 systemd[1]: libpod-conmon-9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc.scope: Deactivated successfully.
Jan 22 22:51:53 compute-0 podman[237159]: 2026-01-22 22:51:53.311649245 +0000 UTC m=+0.042111339 container remove 9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.319 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[08427c60-8ef4-4c8d-b614-4ee23ce8cc55]: (4, ('Thu Jan 22 10:51:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a (9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc)\n9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc\nThu Jan 22 10:51:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a (9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc)\n9bdd845983d64ace58134c48bbd37c3f4390ae794b46638b9ce4df477679ccdc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.321 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[91d937db-f0da-4561-8a8e-b69c37fffa01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.322 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a6ea39e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.323 182729 DEBUG nova.virt.libvirt.vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:51:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:51:30Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.324 182729 DEBUG nova.network.os_vif_util [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "address": "fa:16:3e:ea:07:a8", "network": {"id": "1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a", "bridge": "br-int", "label": "tempest-network-smoke--1767889777", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0749b6ee-d9", "ovs_interfaceid": "0749b6ee-d9e9-4fcf-897c-8183f2cc8329", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.326 182729 DEBUG nova.network.os_vif_util [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.327 182729 DEBUG os_vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.330 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0749b6ee-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.366 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 kernel: tap1a6ea39e-e0: left promiscuous mode
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.369 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.382 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.384 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb966f5-6e8b-45a1-b7d4-7250980ae579]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.385 182729 INFO os_vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:07:a8,bridge_name='br-int',has_traffic_filtering=True,id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329,network=Network(1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0749b6ee-d9')
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.386 182729 DEBUG nova.virt.libvirt.vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:51:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1981038866',display_name='tempest-TestGettingAddress-server-1981038866',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1981038866',id=170,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP7wQQgm+nvasdKoo9UnBEEYHBbRtqxVeAMu2PKK2bB3Vs/Knuq7c8Z4OOWqjGda86NxU4fXnsaBYecx+aN6s9vs/St3Y4A0Mcw4px8fHN5PpiAyVLscb9urgoinYOmVhw==',key_name='tempest-TestGettingAddress-796193196',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:51:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-cvubh1da',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:51:30Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=22018abd-78e4-4ae1-9dc3-b6575ccec3ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.386 182729 DEBUG nova.network.os_vif_util [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.387 182729 DEBUG nova.network.os_vif_util [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.387 182729 DEBUG os_vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.391 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.391 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec90cdd6-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.393 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.394 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.399 182729 INFO os_vif [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:b0:63,bridge_name='br-int',has_traffic_filtering=True,id=ec90cdd6-97f8-4516-9951-b92319878017,network=Network(204db677-5698-4972-9966-0fa4e404c5b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec90cdd6-97')
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.400 182729 INFO nova.virt.libvirt.driver [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Deleting instance files /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab_del
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.399 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab5f425-6a63-4606-930a-b7ebfacda8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.400 182729 INFO nova.virt.libvirt.driver [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Deletion of /var/lib/nova/instances/22018abd-78e4-4ae1-9dc3-b6575ccec3ab_del complete
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.401 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[398b4992-cbb9-4bc1-a58f-49fe0291891e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.417 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[77f3e5aa-5d20-4ef4-b045-16ee76edbf5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592595, 'reachable_time': 40093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237177, 'error': None, 'target': 'ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.419 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a6ea39e-eedc-45e9-bd61-5f4eb61e5c5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.420 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[a3091cb2-aa42-4a23-85c1-5249316fb5e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d1a6ea39e\x2deedc\x2d45e9\x2dbd61\x2d5f4eb61e5c5a.mount: Deactivated successfully.
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.421 104215 INFO neutron.agent.ovn.metadata.agent [-] Port ec90cdd6-97f8-4516-9951-b92319878017 in datapath 204db677-5698-4972-9966-0fa4e404c5b7 unbound from our chassis
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.422 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 204db677-5698-4972-9966-0fa4e404c5b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.423 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce1baef-38a8-4735-a3af-d5f969197c22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.423 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7 namespace which is not needed anymore
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.439 182729 DEBUG nova.compute.manager [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-unplugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.440 182729 DEBUG oslo_concurrency.lockutils [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.440 182729 DEBUG oslo_concurrency.lockutils [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.440 182729 DEBUG oslo_concurrency.lockutils [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.440 182729 DEBUG nova.compute.manager [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No waiting events found dispatching network-vif-unplugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.440 182729 DEBUG nova.compute.manager [req-c9a23cac-799b-4c6d-9994-1c43fed96a89 req-021a9493-e39c-4605-9f48-63e529ac0071 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-unplugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.462 182729 INFO nova.compute.manager [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Took 0.51 seconds to destroy the instance on the hypervisor.
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.462 182729 DEBUG oslo.service.loopingcall [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.463 182729 DEBUG nova.compute.manager [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.463 182729 DEBUG nova.network.neutron [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [NOTICE]   (236910) : haproxy version is 2.8.14-c23fe91
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [NOTICE]   (236910) : path to executable is /usr/sbin/haproxy
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [WARNING]  (236910) : Exiting Master process...
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [ALERT]    (236910) : Current worker (236912) exited with code 143 (Terminated)
Jan 22 22:51:53 compute-0 neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7[236906]: [WARNING]  (236910) : All workers exited. Exiting... (0)
Jan 22 22:51:53 compute-0 systemd[1]: libpod-303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551.scope: Deactivated successfully.
Jan 22 22:51:53 compute-0 conmon[236906]: conmon 303a53113585e1efb218 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551.scope/container/memory.events
Jan 22 22:51:53 compute-0 podman[237196]: 2026-01-22 22:51:53.575108141 +0000 UTC m=+0.057957944 container died 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 22:51:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551-userdata-shm.mount: Deactivated successfully.
Jan 22 22:51:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-5bd29bfe82dd93b807c99efc769401a5d00db8d628e8353ec9210e959b59721f-merged.mount: Deactivated successfully.
Jan 22 22:51:53 compute-0 podman[237196]: 2026-01-22 22:51:53.622667584 +0000 UTC m=+0.105517337 container cleanup 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:51:53 compute-0 systemd[1]: libpod-conmon-303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551.scope: Deactivated successfully.
Jan 22 22:51:53 compute-0 podman[237226]: 2026-01-22 22:51:53.689105617 +0000 UTC m=+0.043810841 container remove 303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.695 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b038a648-391f-449b-9c8a-7507649a26bf]: (4, ('Thu Jan 22 10:51:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7 (303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551)\n303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551\nThu Jan 22 10:51:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7 (303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551)\n303a53113585e1efb218856e8a21e68d16c814790b9434355b4f2318a7d8d551\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.698 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0da8e3a5-8d12-4322-82fe-e3dcff6ca4f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.700 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap204db677-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.703 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 kernel: tap204db677-50: left promiscuous mode
Jan 22 22:51:53 compute-0 nova_compute[182725]: 2026-01-22 22:51:53.729 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.733 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3db8682-cfaf-4895-a091-e5464e9590a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.749 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[048a82e6-9abb-44a1-8d00-9427d94d266c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.750 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[20de46c4-b267-41e9-8a07-1114cc024610]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.778 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05fe9d-44a2-4b3a-a62e-7346d1a467ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592717, 'reachable_time': 31663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237239, 'error': None, 'target': 'ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.781 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-204db677-5698-4972-9966-0fa4e404c5b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:51:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:51:53.781 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8ecf21-2a2c-4386-9e18-edd80dc7f7af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:51:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d204db677\x2d5698\x2d4972\x2d9966\x2d0fa4e404c5b7.mount: Deactivated successfully.
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.523 182729 DEBUG nova.compute.manager [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.524 182729 DEBUG nova.compute.manager [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing instance network info cache due to event network-changed-0749b6ee-d9e9-4fcf-897c-8183f2cc8329. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.525 182729 DEBUG oslo_concurrency.lockutils [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.525 182729 DEBUG oslo_concurrency.lockutils [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.526 182729 DEBUG nova.network.neutron [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Refreshing network info cache for port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.603 182729 DEBUG nova.compute.manager [req-75365a0f-5e13-4d6b-a248-dfccf88f04f5 req-aec301bc-091f-4378-bceb-4593fa809aca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-deleted-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.603 182729 INFO nova.compute.manager [req-75365a0f-5e13-4d6b-a248-dfccf88f04f5 req-aec301bc-091f-4378-bceb-4593fa809aca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Neutron deleted interface 0749b6ee-d9e9-4fcf-897c-8183f2cc8329; detaching it from the instance and deleting it from the info cache
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.604 182729 DEBUG nova.network.neutron [req-75365a0f-5e13-4d6b-a248-dfccf88f04f5 req-aec301bc-091f-4378-bceb-4593fa809aca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.652 182729 DEBUG nova.compute.manager [req-75365a0f-5e13-4d6b-a248-dfccf88f04f5 req-aec301bc-091f-4378-bceb-4593fa809aca 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Detach interface failed, port_id=0749b6ee-d9e9-4fcf-897c-8183f2cc8329, reason: Instance 22018abd-78e4-4ae1-9dc3-b6575ccec3ab could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.734 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.740 182729 INFO nova.network.neutron [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Port 0749b6ee-d9e9-4fcf-897c-8183f2cc8329 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.741 182729 DEBUG nova.network.neutron [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [{"id": "ec90cdd6-97f8-4516-9951-b92319878017", "address": "fa:16:3e:0c:b0:63", "network": {"id": "204db677-5698-4972-9966-0fa4e404c5b7", "bridge": "br-int", "label": "tempest-network-smoke--991742531", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0c:b063", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec90cdd6-97", "ovs_interfaceid": "ec90cdd6-97f8-4516-9951-b92319878017", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.769 182729 DEBUG oslo_concurrency.lockutils [req-0930b920-dda0-4f50-af2a-ef457cf80113 req-60271099-1705-4330-aadf-7d5d4ec4e760 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-22018abd-78e4-4ae1-9dc3-b6575ccec3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.886 182729 DEBUG nova.network.neutron [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.902 182729 INFO nova.compute.manager [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Took 1.44 seconds to deallocate network for instance.
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.996 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:54 compute-0 nova_compute[182725]: 2026-01-22 22:51:54.997 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.050 182729 DEBUG nova.compute.provider_tree [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.073 182729 DEBUG nova.scheduler.client.report [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.091 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.114 182729 INFO nova.scheduler.client.report [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance 22018abd-78e4-4ae1-9dc3-b6575ccec3ab
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.206 182729 DEBUG oslo_concurrency.lockutils [None req-1d7723a1-52d1-43ab-a459-d98d3678856b 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.560 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.561 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.561 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.562 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.562 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No waiting events found dispatching network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.563 182729 WARNING nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received unexpected event network-vif-plugged-0749b6ee-d9e9-4fcf-897c-8183f2cc8329 for instance with vm_state deleted and task_state None.
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.563 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-unplugged-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.564 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.564 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.564 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.565 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No waiting events found dispatching network-vif-unplugged-ec90cdd6-97f8-4516-9951-b92319878017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.565 182729 WARNING nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received unexpected event network-vif-unplugged-ec90cdd6-97f8-4516-9951-b92319878017 for instance with vm_state deleted and task_state None.
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.566 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.566 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.567 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.567 182729 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "22018abd-78e4-4ae1-9dc3-b6575ccec3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.567 182729 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] No waiting events found dispatching network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:51:55 compute-0 nova_compute[182725]: 2026-01-22 22:51:55.568 182729 WARNING nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received unexpected event network-vif-plugged-ec90cdd6-97f8-4516-9951-b92319878017 for instance with vm_state deleted and task_state None.
Jan 22 22:51:56 compute-0 nova_compute[182725]: 2026-01-22 22:51:56.759 182729 DEBUG nova.compute.manager [req-8efd6143-129b-44a3-b4a8-dc15293ad555 req-6b882517-6bde-43a4-a798-7298d330a3d2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Received event network-vif-deleted-ec90cdd6-97f8-4516-9951-b92319878017 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:51:58 compute-0 nova_compute[182725]: 2026-01-22 22:51:58.394 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:51:59 compute-0 nova_compute[182725]: 2026-01-22 22:51:59.736 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:00 compute-0 nova_compute[182725]: 2026-01-22 22:52:00.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:03 compute-0 podman[237242]: 2026-01-22 22:52:03.158597728 +0000 UTC m=+0.081037897 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:52:03 compute-0 podman[237240]: 2026-01-22 22:52:03.173048138 +0000 UTC m=+0.093890898 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:52:03 compute-0 podman[237241]: 2026-01-22 22:52:03.173043788 +0000 UTC m=+0.090058512 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:52:03 compute-0 nova_compute[182725]: 2026-01-22 22:52:03.398 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:04 compute-0 nova_compute[182725]: 2026-01-22 22:52:04.778 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:08 compute-0 nova_compute[182725]: 2026-01-22 22:52:08.254 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122313.2529488, 22018abd-78e4-4ae1-9dc3-b6575ccec3ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:52:08 compute-0 nova_compute[182725]: 2026-01-22 22:52:08.255 182729 INFO nova.compute.manager [-] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] VM Stopped (Lifecycle Event)
Jan 22 22:52:08 compute-0 nova_compute[182725]: 2026-01-22 22:52:08.281 182729 DEBUG nova.compute.manager [None req-75012f8a-f9e2-4978-aaa0-36a2997e77c5 - - - - - -] [instance: 22018abd-78e4-4ae1-9dc3-b6575ccec3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:52:08 compute-0 nova_compute[182725]: 2026-01-22 22:52:08.401 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:52:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:52:09 compute-0 nova_compute[182725]: 2026-01-22 22:52:09.806 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:11 compute-0 nova_compute[182725]: 2026-01-22 22:52:11.171 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:11 compute-0 nova_compute[182725]: 2026-01-22 22:52:11.319 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:12.463 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:12.463 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:12.463 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:13 compute-0 nova_compute[182725]: 2026-01-22 22:52:13.405 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:14 compute-0 nova_compute[182725]: 2026-01-22 22:52:14.810 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:16 compute-0 podman[237308]: 2026-01-22 22:52:16.130584797 +0000 UTC m=+0.066086935 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:52:18 compute-0 nova_compute[182725]: 2026-01-22 22:52:18.410 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:19 compute-0 nova_compute[182725]: 2026-01-22 22:52:19.814 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.432 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.433 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.457 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.588 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.589 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.611 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.612 182729 INFO nova.compute.claims [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.756 182729 DEBUG nova.compute.provider_tree [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.772 182729 DEBUG nova.scheduler.client.report [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.795 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.795 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.866 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.867 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.882 182729 INFO nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:52:20 compute-0 nova_compute[182725]: 2026-01-22 22:52:20.906 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.061 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.063 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.063 182729 INFO nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Creating image(s)
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.064 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.064 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.065 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.079 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:21 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:21.079 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:52:21 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:21.081 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.113 182729 DEBUG nova.policy [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.162 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.163 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.164 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.176 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.239 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.240 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.282 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.284 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.284 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.382 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.384 182729 DEBUG nova.virt.disk.api [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.384 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.449 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.451 182729 DEBUG nova.virt.disk.api [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.451 182729 DEBUG nova.objects.instance [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid fc2e1e26-1c39-434f-b806-e3c274d18ac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.463 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.463 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Ensure instance console log exists: /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.464 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.464 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.465 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:21 compute-0 nova_compute[182725]: 2026-01-22 22:52:21.956 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Successfully created port: 09457a4e-2a61-438d-8819-9da90cc24f75 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.715 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Successfully updated port: 09457a4e-2a61-438d-8819-9da90cc24f75 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.733 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.734 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.734 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.907 182729 DEBUG nova.compute.manager [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-changed-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.908 182729 DEBUG nova.compute.manager [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Refreshing instance network info cache due to event network-changed-09457a4e-2a61-438d-8819-9da90cc24f75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:52:22 compute-0 nova_compute[182725]: 2026-01-22 22:52:22.908 182729 DEBUG oslo_concurrency.lockutils [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.015 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.852 182729 DEBUG nova.network.neutron [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updating instance_info_cache with network_info: [{"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.868 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.869 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Instance network_info: |[{"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.869 182729 DEBUG oslo_concurrency.lockutils [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.869 182729 DEBUG nova.network.neutron [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Refreshing network info cache for port 09457a4e-2a61-438d-8819-9da90cc24f75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.872 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Start _get_guest_xml network_info=[{"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.877 182729 WARNING nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.882 182729 DEBUG nova.virt.libvirt.host [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.882 182729 DEBUG nova.virt.libvirt.host [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.889 182729 DEBUG nova.virt.libvirt.host [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.890 182729 DEBUG nova.virt.libvirt.host [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.891 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.892 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.892 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.892 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.892 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.893 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.893 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.893 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.893 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.893 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.894 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.894 182729 DEBUG nova.virt.hardware [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.897 182729 DEBUG nova.virt.libvirt.vif [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=172,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-acu8tygj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:52:20Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=fc2e1e26-1c39-434f-b806-e3c274d18ac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.897 182729 DEBUG nova.network.os_vif_util [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.898 182729 DEBUG nova.network.os_vif_util [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.899 182729 DEBUG nova.objects.instance [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid fc2e1e26-1c39-434f-b806-e3c274d18ac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.919 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <uuid>fc2e1e26-1c39-434f-b806-e3c274d18ac1</uuid>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <name>instance-000000ac</name>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220</nova:name>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:52:23</nova:creationTime>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         <nova:port uuid="09457a4e-2a61-438d-8819-9da90cc24f75">
Jan 22 22:52:23 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <system>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="serial">fc2e1e26-1c39-434f-b806-e3c274d18ac1</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="uuid">fc2e1e26-1c39-434f-b806-e3c274d18ac1</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </system>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <os>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </os>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <features>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </features>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.config"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:98:af:64"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <target dev="tap09457a4e-2a"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/console.log" append="off"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <video>
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </video>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:52:23 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:52:23 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:52:23 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:52:23 compute-0 nova_compute[182725]: </domain>
Jan 22 22:52:23 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.920 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Preparing to wait for external event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.920 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.921 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.921 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.922 182729 DEBUG nova.virt.libvirt.vif [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=172,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-acu8tygj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:52:20Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=fc2e1e26-1c39-434f-b806-e3c274d18ac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.922 182729 DEBUG nova.network.os_vif_util [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.922 182729 DEBUG nova.network.os_vif_util [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.923 182729 DEBUG os_vif [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.923 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.924 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.924 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.930 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.931 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09457a4e-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.932 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09457a4e-2a, col_values=(('external_ids', {'iface-id': '09457a4e-2a61-438d-8819-9da90cc24f75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:af:64', 'vm-uuid': 'fc2e1e26-1c39-434f-b806-e3c274d18ac1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.935 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:23 compute-0 NetworkManager[54954]: <info>  [1769122343.9361] manager: (tap09457a4e-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:23 compute-0 nova_compute[182725]: 2026-01-22 22:52:23.946 182729 INFO os_vif [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a')
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.021 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.022 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.022 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:98:af:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.023 182729 INFO nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Using config drive
Jan 22 22:52:24 compute-0 podman[237347]: 2026-01-22 22:52:24.056009479 +0000 UTC m=+0.064000004 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 22 22:52:24 compute-0 podman[237346]: 2026-01-22 22:52:24.085957624 +0000 UTC m=+0.096944153 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.416 182729 INFO nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Creating config drive at /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.config
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.421 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumctr173 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.545 182729 DEBUG oslo_concurrency.processutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumctr173" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:24 compute-0 kernel: tap09457a4e-2a: entered promiscuous mode
Jan 22 22:52:24 compute-0 ovn_controller[94850]: 2026-01-22T22:52:24Z|00706|binding|INFO|Claiming lport 09457a4e-2a61-438d-8819-9da90cc24f75 for this chassis.
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.612 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.6152] manager: (tap09457a4e-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 22 22:52:24 compute-0 ovn_controller[94850]: 2026-01-22T22:52:24Z|00707|binding|INFO|09457a4e-2a61-438d-8819-9da90cc24f75: Claiming fa:16:3e:98:af:64 10.100.0.9
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.616 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.623 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 systemd-udevd[237407]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.6441] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.642 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.6452] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.650 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:af:64 10.100.0.9'], port_security=['fa:16:3e:98:af:64 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fc2e1e26-1c39-434f-b806-e3c274d18ac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '26a6c206-3f87-4a1c-b08e-7e4fba0abf68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8674c23-6131-45a3-8d73-f4e7ea0ce5e8, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=09457a4e-2a61-438d-8819-9da90cc24f75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.651 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 09457a4e-2a61-438d-8819-9da90cc24f75 in datapath e3bdfa79-ec53-4a7f-b83a-e5086bff52fd bound to our chassis
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.652 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bdfa79-ec53-4a7f-b83a-e5086bff52fd
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.6550] device (tap09457a4e-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.6563] device (tap09457a4e-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:52:24 compute-0 systemd-machined[154006]: New machine qemu-74-instance-000000ac.
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.665 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fec28dea-e2dd-4c7f-b943-fb7bcde6eee0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.666 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3bdfa79-e1 in ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.667 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3bdfa79-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.667 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ab57a0-578e-44a6-848c-0c9a97ef9713]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.668 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[77c6cf81-9f5d-4b7f-860a-221b4deb3ba2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.680 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[41c8eb66-df2d-40dc-816a-a7e2e307dd64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-000000ac.
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.707 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ba88a357-f5db-4d9f-92f2-7c0cc40cfcb7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.740 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf13b76-ffb4-4401-aa2f-f411f589c8c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.748 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.756 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fe54c560-f976-4124-9a67-b4132f53565d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.7567] manager: (tape3bdfa79-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Jan 22 22:52:24 compute-0 systemd-udevd[237410]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.759 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 ovn_controller[94850]: 2026-01-22T22:52:24Z|00708|binding|INFO|Setting lport 09457a4e-2a61-438d-8819-9da90cc24f75 ovn-installed in OVS
Jan 22 22:52:24 compute-0 ovn_controller[94850]: 2026-01-22T22:52:24Z|00709|binding|INFO|Setting lport 09457a4e-2a61-438d-8819-9da90cc24f75 up in Southbound
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.794 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[949f9ad5-f673-4bb2-9953-8be89ef4957e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.798 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f234fe51-c661-4dba-8496-5e389aec4b56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 nova_compute[182725]: 2026-01-22 22:52:24.816 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:24 compute-0 NetworkManager[54954]: <info>  [1769122344.8244] device (tape3bdfa79-e0): carrier: link connected
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.830 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9daee501-b0f3-4d41-a365-c9bdb4eb4e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.849 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e874a4ce-4a8b-4558-9a09-183a741b4f41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bdfa79-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:a8:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598242, 'reachable_time': 37546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237441, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.869 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9d9dfb-cfba-45ef-955f-51304bd0117c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:a8c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598242, 'tstamp': 598242}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237442, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.892 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3daecda7-1c71-4650-a960-03c996443ada]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bdfa79-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:a8:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598242, 'reachable_time': 37546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237443, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:24.939 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[27d1ea41-77ab-4059-9864-45a16c1f7d37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.037 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a8baac98-4b51-4431-a8cc-56e417d36bf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.039 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bdfa79-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.039 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.039 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bdfa79-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:25 compute-0 NetworkManager[54954]: <info>  [1769122345.0423] manager: (tape3bdfa79-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 22 22:52:25 compute-0 kernel: tape3bdfa79-e0: entered promiscuous mode
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.041 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.044 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bdfa79-e0, col_values=(('external_ids', {'iface-id': 'b2aeb8c1-7e46-42c5-86c7-fddcc6060da1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.046 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:25 compute-0 ovn_controller[94850]: 2026-01-22T22:52:25Z|00710|binding|INFO|Releasing lport b2aeb8c1-7e46-42c5-86c7-fddcc6060da1 from this chassis (sb_readonly=0)
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.058 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.059 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.061 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6b48a4d7-735e-4b37-a706-f69ae05ee4b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.061 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID e3bdfa79-ec53-4a7f-b83a-e5086bff52fd
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:52:25 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:25.063 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'env', 'PROCESS_TAG=haproxy-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:52:25 compute-0 podman[237475]: 2026-01-22 22:52:25.505727591 +0000 UTC m=+0.067534812 container create 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 22:52:25 compute-0 podman[237475]: 2026-01-22 22:52:25.461434198 +0000 UTC m=+0.023241429 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:52:25 compute-0 systemd[1]: Started libpod-conmon-5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef.scope.
Jan 22 22:52:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:52:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4071d94af943f2b2b742ba6a6c55e9bf2c17b3363b8cc561e954b6886167f7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:52:25 compute-0 podman[237475]: 2026-01-22 22:52:25.620223989 +0000 UTC m=+0.182031210 container init 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:52:25 compute-0 podman[237475]: 2026-01-22 22:52:25.625737277 +0000 UTC m=+0.187544478 container start 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 22:52:25 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [NOTICE]   (237494) : New worker (237496) forked
Jan 22 22:52:25 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [NOTICE]   (237494) : Loading success.
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.777 182729 DEBUG nova.network.neutron [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updated VIF entry in instance network info cache for port 09457a4e-2a61-438d-8819-9da90cc24f75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.777 182729 DEBUG nova.network.neutron [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updating instance_info_cache with network_info: [{"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.793 182729 DEBUG oslo_concurrency.lockutils [req-5fde09cb-59d7-419f-a592-da10b7689c1b req-ea8d5ced-5e84-4baa-b01a-5dc0303b04c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.853 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122345.8522248, fc2e1e26-1c39-434f-b806-e3c274d18ac1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.854 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] VM Started (Lifecycle Event)
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.882 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.896 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122345.8532724, fc2e1e26-1c39-434f-b806-e3c274d18ac1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.897 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] VM Paused (Lifecycle Event)
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.925 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.929 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:52:25 compute-0 nova_compute[182725]: 2026-01-22 22:52:25.946 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:52:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:26.191 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8::f816:3eff:fe8c:995a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7d34b7c4-140b-4dda-ad27-aa5734d5709c) old=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:52:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:26.195 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7d34b7c4-140b-4dda-ad27-aa5734d5709c in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 updated
Jan 22 22:52:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:26.197 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:52:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:26.198 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[65757db9-2c2f-456c-a0f7-419e4941195c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:28 compute-0 nova_compute[182725]: 2026-01-22 22:52:28.899 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:28 compute-0 nova_compute[182725]: 2026-01-22 22:52:28.936 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:29 compute-0 nova_compute[182725]: 2026-01-22 22:52:29.820 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:30.083 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.024 182729 DEBUG nova.compute.manager [req-b5a2d7c3-1dda-4bb0-9c32-e53b7a977c27 req-2d08711c-07bb-4d4c-9191-61e29484d707 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.024 182729 DEBUG oslo_concurrency.lockutils [req-b5a2d7c3-1dda-4bb0-9c32-e53b7a977c27 req-2d08711c-07bb-4d4c-9191-61e29484d707 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.025 182729 DEBUG oslo_concurrency.lockutils [req-b5a2d7c3-1dda-4bb0-9c32-e53b7a977c27 req-2d08711c-07bb-4d4c-9191-61e29484d707 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.026 182729 DEBUG oslo_concurrency.lockutils [req-b5a2d7c3-1dda-4bb0-9c32-e53b7a977c27 req-2d08711c-07bb-4d4c-9191-61e29484d707 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.026 182729 DEBUG nova.compute.manager [req-b5a2d7c3-1dda-4bb0-9c32-e53b7a977c27 req-2d08711c-07bb-4d4c-9191-61e29484d707 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Processing event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.027 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.032 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122351.0317485, fc2e1e26-1c39-434f-b806-e3c274d18ac1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.032 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] VM Resumed (Lifecycle Event)
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.035 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.040 182729 INFO nova.virt.libvirt.driver [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Instance spawned successfully.
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.041 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.241 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.243 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.243 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.244 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.244 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.245 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.245 182729 DEBUG nova.virt.libvirt.driver [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.251 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.511 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.586 182729 INFO nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Took 10.52 seconds to spawn the instance on the hypervisor.
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.586 182729 DEBUG nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:52:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:31.644 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8:0:1:f816:3eff:fe8c:995a 2001:db8::f816:3eff:fe8c:995a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe8c:995a/64 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7d34b7c4-140b-4dda-ad27-aa5734d5709c) old=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8::f816:3eff:fe8c:995a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:52:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:31.646 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7d34b7c4-140b-4dda-ad27-aa5734d5709c in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 updated
Jan 22 22:52:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:31.649 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:52:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:31.650 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f590d862-7678-463b-b157-96a4105920d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.704 182729 INFO nova.compute.manager [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Took 11.17 seconds to build instance.
Jan 22 22:52:31 compute-0 nova_compute[182725]: 2026-01-22 22:52:31.726 182729 DEBUG oslo_concurrency.lockutils [None req-c489aac9-0885-4cb8-a71e-85da05e264c1 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.144 182729 DEBUG nova.compute.manager [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.145 182729 DEBUG oslo_concurrency.lockutils [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.145 182729 DEBUG oslo_concurrency.lockutils [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.145 182729 DEBUG oslo_concurrency.lockutils [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.146 182729 DEBUG nova.compute.manager [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] No waiting events found dispatching network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.146 182729 WARNING nova.compute.manager [req-bc35dd74-77a2-49f5-99b6-ac0510785834 req-f64f1e70-4b62-4e1b-9b8b-d1117f5a1a72 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received unexpected event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 for instance with vm_state active and task_state None.
Jan 22 22:52:33 compute-0 nova_compute[182725]: 2026-01-22 22:52:33.941 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:34 compute-0 podman[237512]: 2026-01-22 22:52:34.134814639 +0000 UTC m=+0.062006643 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:52:34 compute-0 podman[237513]: 2026-01-22 22:52:34.13563103 +0000 UTC m=+0.062079896 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:52:34 compute-0 podman[237514]: 2026-01-22 22:52:34.180750342 +0000 UTC m=+0.095892686 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:52:34 compute-0 nova_compute[182725]: 2026-01-22 22:52:34.867 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:37 compute-0 nova_compute[182725]: 2026-01-22 22:52:37.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:37 compute-0 nova_compute[182725]: 2026-01-22 22:52:37.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:52:37 compute-0 nova_compute[182725]: 2026-01-22 22:52:37.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:52:38 compute-0 nova_compute[182725]: 2026-01-22 22:52:38.048 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:52:38 compute-0 nova_compute[182725]: 2026-01-22 22:52:38.049 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:52:38 compute-0 nova_compute[182725]: 2026-01-22 22:52:38.049 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:52:38 compute-0 nova_compute[182725]: 2026-01-22 22:52:38.050 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fc2e1e26-1c39-434f-b806-e3c274d18ac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:52:38 compute-0 nova_compute[182725]: 2026-01-22 22:52:38.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.351 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updating instance_info_cache with network_info: [{"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.569 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-fc2e1e26-1c39-434f-b806-e3c274d18ac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.570 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.570 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.869 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.922 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.923 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.924 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:39 compute-0 nova_compute[182725]: 2026-01-22 22:52:39.924 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.011 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.071 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.072 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.132 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.270 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.271 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5561MB free_disk=73.31565475463867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.271 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.272 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.342 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance fc2e1e26-1c39-434f-b806-e3c274d18ac1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.342 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.343 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.377 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.394 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.419 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:52:40 compute-0 nova_compute[182725]: 2026-01-22 22:52:40.419 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:42 compute-0 nova_compute[182725]: 2026-01-22 22:52:42.420 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:42 compute-0 nova_compute[182725]: 2026-01-22 22:52:42.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:43 compute-0 nova_compute[182725]: 2026-01-22 22:52:43.947 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:44 compute-0 nova_compute[182725]: 2026-01-22 22:52:44.871 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:44 compute-0 nova_compute[182725]: 2026-01-22 22:52:44.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:45 compute-0 ovn_controller[94850]: 2026-01-22T22:52:45Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:af:64 10.100.0.9
Jan 22 22:52:45 compute-0 ovn_controller[94850]: 2026-01-22T22:52:45Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:af:64 10.100.0.9
Jan 22 22:52:45 compute-0 nova_compute[182725]: 2026-01-22 22:52:45.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:47 compute-0 podman[237599]: 2026-01-22 22:52:47.165217821 +0000 UTC m=+0.095690412 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 22:52:47 compute-0 nova_compute[182725]: 2026-01-22 22:52:47.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:52:47 compute-0 nova_compute[182725]: 2026-01-22 22:52:47.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:52:48 compute-0 nova_compute[182725]: 2026-01-22 22:52:48.950 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:49 compute-0 nova_compute[182725]: 2026-01-22 22:52:49.873 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.403 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.403 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.404 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.404 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.405 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.420 182729 INFO nova.compute.manager [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Terminating instance
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.435 182729 DEBUG nova.compute.manager [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:52:50 compute-0 kernel: tap09457a4e-2a (unregistering): left promiscuous mode
Jan 22 22:52:50 compute-0 NetworkManager[54954]: <info>  [1769122370.4599] device (tap09457a4e-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 ovn_controller[94850]: 2026-01-22T22:52:50Z|00711|binding|INFO|Releasing lport 09457a4e-2a61-438d-8819-9da90cc24f75 from this chassis (sb_readonly=0)
Jan 22 22:52:50 compute-0 ovn_controller[94850]: 2026-01-22T22:52:50Z|00712|binding|INFO|Setting lport 09457a4e-2a61-438d-8819-9da90cc24f75 down in Southbound
Jan 22 22:52:50 compute-0 ovn_controller[94850]: 2026-01-22T22:52:50Z|00713|binding|INFO|Removing iface tap09457a4e-2a ovn-installed in OVS
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.481 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:50.490 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:af:64 10.100.0.9'], port_security=['fa:16:3e:98:af:64 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fc2e1e26-1c39-434f-b806-e3c274d18ac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '26a6c206-3f87-4a1c-b08e-7e4fba0abf68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8674c23-6131-45a3-8d73-f4e7ea0ce5e8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=09457a4e-2a61-438d-8819-9da90cc24f75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:52:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:50.492 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 09457a4e-2a61-438d-8819-9da90cc24f75 in datapath e3bdfa79-ec53-4a7f-b83a-e5086bff52fd unbound from our chassis
Jan 22 22:52:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:50.495 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:52:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:50.498 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[897e38a8-afb9-4dda-9b84-0755d1021b5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:50 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:50.499 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd namespace which is not needed anymore
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.505 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Jan 22 22:52:50 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000ac.scope: Consumed 13.252s CPU time.
Jan 22 22:52:50 compute-0 systemd-machined[154006]: Machine qemu-74-instance-000000ac terminated.
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.709 182729 INFO nova.virt.libvirt.driver [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Instance destroyed successfully.
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.709 182729 DEBUG nova.objects.instance [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid fc2e1e26-1c39-434f-b806-e3c274d18ac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.724 182729 DEBUG nova.virt.libvirt.vif [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-0-1181037220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=172,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:52:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-acu8tygj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:52:31Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=fc2e1e26-1c39-434f-b806-e3c274d18ac1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.725 182729 DEBUG nova.network.os_vif_util [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "09457a4e-2a61-438d-8819-9da90cc24f75", "address": "fa:16:3e:98:af:64", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09457a4e-2a", "ovs_interfaceid": "09457a4e-2a61-438d-8819-9da90cc24f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.725 182729 DEBUG nova.network.os_vif_util [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.726 182729 DEBUG os_vif [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.728 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.728 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09457a4e-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.730 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.732 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.735 182729 INFO os_vif [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:af:64,bridge_name='br-int',has_traffic_filtering=True,id=09457a4e-2a61-438d-8819-9da90cc24f75,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09457a4e-2a')
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.736 182729 INFO nova.virt.libvirt.driver [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Deleting instance files /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1_del
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.737 182729 INFO nova.virt.libvirt.driver [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Deletion of /var/lib/nova/instances/fc2e1e26-1c39-434f-b806-e3c274d18ac1_del complete
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [NOTICE]   (237494) : haproxy version is 2.8.14-c23fe91
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [NOTICE]   (237494) : path to executable is /usr/sbin/haproxy
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [WARNING]  (237494) : Exiting Master process...
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [WARNING]  (237494) : Exiting Master process...
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [ALERT]    (237494) : Current worker (237496) exited with code 143 (Terminated)
Jan 22 22:52:50 compute-0 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[237490]: [WARNING]  (237494) : All workers exited. Exiting... (0)
Jan 22 22:52:50 compute-0 systemd[1]: libpod-5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef.scope: Deactivated successfully.
Jan 22 22:52:50 compute-0 podman[237644]: 2026-01-22 22:52:50.787754308 +0000 UTC m=+0.175428966 container died 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.804 182729 INFO nova.compute.manager [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.805 182729 DEBUG oslo.service.loopingcall [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.805 182729 DEBUG nova.compute.manager [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:52:50 compute-0 nova_compute[182725]: 2026-01-22 22:52:50.805 182729 DEBUG nova.network.neutron [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef-userdata-shm.mount: Deactivated successfully.
Jan 22 22:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-e4071d94af943f2b2b742ba6a6c55e9bf2c17b3363b8cc561e954b6886167f7d-merged.mount: Deactivated successfully.
Jan 22 22:52:51 compute-0 podman[237644]: 2026-01-22 22:52:51.108616842 +0000 UTC m=+0.496291520 container cleanup 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 22:52:51 compute-0 systemd[1]: libpod-conmon-5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef.scope: Deactivated successfully.
Jan 22 22:52:51 compute-0 podman[237692]: 2026-01-22 22:52:51.218195018 +0000 UTC m=+0.072110275 container remove 5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.225 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1c28ea-48c4-4884-808a-ec2c07147d00]: (4, ('Thu Jan 22 10:52:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd (5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef)\n5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef\nThu Jan 22 10:52:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd (5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef)\n5c8e647264a9f87894e93a76bdaf6d05dd137f0d54b370d6189139ca99709bef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.226 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bd8206-2443-4ac8-b39d-7aaa9f530ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.227 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bdfa79-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.228 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:51 compute-0 kernel: tape3bdfa79-e0: left promiscuous mode
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.240 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.242 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b65fd066-e0ce-4266-80f7-c4d05aeab2b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.253 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8d56b633-d6ef-4f6f-85d3-c6f248002751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.255 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[56e77d5f-4610-40f1-bf27-cf5e77a7116d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.273 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d92687-7a74-4c1d-b4d5-0cd45baf2083]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598233, 'reachable_time': 30770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237710, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 systemd[1]: run-netns-ovnmeta\x2de3bdfa79\x2dec53\x2d4a7f\x2db83a\x2de5086bff52fd.mount: Deactivated successfully.
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.278 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:52:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:52:51.279 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[af799b2e-fc07-429e-b076-3ec647828dc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.580 182729 DEBUG nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-unplugged-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] No waiting events found dispatching network-vif-unplugged-09457a4e-2a61-438d-8819-9da90cc24f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-unplugged-09457a4e-2a61-438d-8819-9da90cc24f75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.581 182729 DEBUG nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.582 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.582 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.582 182729 DEBUG oslo_concurrency.lockutils [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.582 182729 DEBUG nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] No waiting events found dispatching network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:52:51 compute-0 nova_compute[182725]: 2026-01-22 22:52:51.582 182729 WARNING nova.compute.manager [req-f40c2b24-8a70-4dbf-b377-683b21ea2940 req-78952c66-fe40-4921-bcf2-470e29cf111d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received unexpected event network-vif-plugged-09457a4e-2a61-438d-8819-9da90cc24f75 for instance with vm_state active and task_state deleting.
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.085 182729 DEBUG nova.network.neutron [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.105 182729 INFO nova.compute.manager [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Took 1.30 seconds to deallocate network for instance.
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.155 182729 DEBUG nova.compute.manager [req-001bc598-0345-4862-aabf-35255ebf9518 req-08470797-9058-408f-9232-cb709f147bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Received event network-vif-deleted-09457a4e-2a61-438d-8819-9da90cc24f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.178 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.178 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.235 182729 DEBUG nova.compute.provider_tree [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.248 182729 DEBUG nova.scheduler.client.report [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.269 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.294 182729 INFO nova.scheduler.client.report [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance fc2e1e26-1c39-434f-b806-e3c274d18ac1
Jan 22 22:52:52 compute-0 nova_compute[182725]: 2026-01-22 22:52:52.362 182729 DEBUG oslo_concurrency.lockutils [None req-0e233b20-6785-41c4-b341-42ecaa25d8a8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "fc2e1e26-1c39-434f-b806-e3c274d18ac1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:52:54 compute-0 nova_compute[182725]: 2026-01-22 22:52:54.879 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:55 compute-0 podman[237713]: 2026-01-22 22:52:55.162541442 +0000 UTC m=+0.077451338 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public)
Jan 22 22:52:55 compute-0 podman[237712]: 2026-01-22 22:52:55.190344354 +0000 UTC m=+0.118789447 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:52:55 compute-0 nova_compute[182725]: 2026-01-22 22:52:55.763 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:52:59 compute-0 nova_compute[182725]: 2026-01-22 22:52:59.877 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:00 compute-0 nova_compute[182725]: 2026-01-22 22:53:00.766 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:04 compute-0 nova_compute[182725]: 2026-01-22 22:53:04.879 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:05 compute-0 podman[237760]: 2026-01-22 22:53:05.155909466 +0000 UTC m=+0.075289854 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:53:05 compute-0 podman[237762]: 2026-01-22 22:53:05.159197748 +0000 UTC m=+0.072784692 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:53:05 compute-0 podman[237761]: 2026-01-22 22:53:05.180214211 +0000 UTC m=+0.096186334 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 22:53:05 compute-0 nova_compute[182725]: 2026-01-22 22:53:05.708 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122370.70626, fc2e1e26-1c39-434f-b806-e3c274d18ac1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:53:05 compute-0 nova_compute[182725]: 2026-01-22 22:53:05.708 182729 INFO nova.compute.manager [-] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] VM Stopped (Lifecycle Event)
Jan 22 22:53:05 compute-0 nova_compute[182725]: 2026-01-22 22:53:05.734 182729 DEBUG nova.compute.manager [None req-90cfd2ef-339d-4149-81c1-3abe704374bf - - - - - -] [instance: fc2e1e26-1c39-434f-b806-e3c274d18ac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:53:05 compute-0 nova_compute[182725]: 2026-01-22 22:53:05.769 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:06 compute-0 nova_compute[182725]: 2026-01-22 22:53:06.402 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:06 compute-0 nova_compute[182725]: 2026-01-22 22:53:06.515 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:09 compute-0 nova_compute[182725]: 2026-01-22 22:53:09.907 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:10 compute-0 nova_compute[182725]: 2026-01-22 22:53:10.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:12.464 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:12.464 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:12.464 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:14 compute-0 nova_compute[182725]: 2026-01-22 22:53:14.908 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:15 compute-0 nova_compute[182725]: 2026-01-22 22:53:15.774 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:18 compute-0 podman[237828]: 2026-01-22 22:53:18.147536765 +0000 UTC m=+0.082568106 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 22:53:19 compute-0 nova_compute[182725]: 2026-01-22 22:53:19.910 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:20 compute-0 nova_compute[182725]: 2026-01-22 22:53:20.776 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:22 compute-0 nova_compute[182725]: 2026-01-22 22:53:22.892 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:22 compute-0 nova_compute[182725]: 2026-01-22 22:53:22.892 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:22 compute-0 nova_compute[182725]: 2026-01-22 22:53:22.928 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.074 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.075 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.083 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.084 182729 INFO nova.compute.claims [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.378 182729 DEBUG nova.compute.provider_tree [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.399 182729 DEBUG nova.scheduler.client.report [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.483 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.484 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.548 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.549 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.583 182729 INFO nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.620 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.798 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.799 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.799 182729 INFO nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Creating image(s)
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.800 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.800 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.801 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.813 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.871 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.872 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.873 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.888 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.987 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:23 compute-0 nova_compute[182725]: 2026-01-22 22:53:23.989 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.029 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.030 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.030 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.102 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.103 182729 DEBUG nova.virt.disk.api [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.103 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.158 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.159 182729 DEBUG nova.virt.disk.api [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.159 182729 DEBUG nova.objects.instance [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid f56e3bc0-cc0a-49cb-8197-b90fda658edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.163 182729 DEBUG nova.policy [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.184 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.185 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Ensure instance console log exists: /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.185 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.186 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.186 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:24 compute-0 nova_compute[182725]: 2026-01-22 22:53:24.912 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:25 compute-0 nova_compute[182725]: 2026-01-22 22:53:25.779 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:26 compute-0 podman[237864]: 2026-01-22 22:53:26.131505288 +0000 UTC m=+0.060279779 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 22 22:53:26 compute-0 podman[237863]: 2026-01-22 22:53:26.238498448 +0000 UTC m=+0.173159415 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:53:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:26.296 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:53:26 compute-0 nova_compute[182725]: 2026-01-22 22:53:26.297 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:26 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:26.298 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:53:26 compute-0 nova_compute[182725]: 2026-01-22 22:53:26.894 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Successfully created port: d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:53:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:28.299 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.260 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Successfully updated port: d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.290 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.290 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.291 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.498 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:53:29 compute-0 nova_compute[182725]: 2026-01-22 22:53:29.913 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:30 compute-0 nova_compute[182725]: 2026-01-22 22:53:30.783 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:30 compute-0 nova_compute[182725]: 2026-01-22 22:53:30.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:30 compute-0 nova_compute[182725]: 2026-01-22 22:53:30.988 182729 DEBUG nova.network.neutron [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.023 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.024 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Instance network_info: |[{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.029 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Start _get_guest_xml network_info=[{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.036 182729 WARNING nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.043 182729 DEBUG nova.virt.libvirt.host [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.043 182729 DEBUG nova.virt.libvirt.host [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.051 182729 DEBUG nova.virt.libvirt.host [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.052 182729 DEBUG nova.virt.libvirt.host [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.053 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.053 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.053 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.053 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.054 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.054 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.054 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.054 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.054 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.055 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.055 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.055 182729 DEBUG nova.virt.hardware [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.058 182729 DEBUG nova.virt.libvirt.vif [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:53:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=175,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqeydfN17hZWjpgNkvtTTCheeYHHKdwSPpmwXNdhEz86TsUWZDN0tyralaf4RnRYw6P5/i8Jibda7fY8XZFO9/IMC66WzzHOcjPi7OrUjOoH49qfsv0+e2l/HBahE0rGA==',key_name='tempest-TestSecurityGroupsBasicOps-1137419110',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-ufu701mn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:53:23Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=f56e3bc0-cc0a-49cb-8197-b90fda658edd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.059 182729 DEBUG nova.network.os_vif_util [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.059 182729 DEBUG nova.network.os_vif_util [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.060 182729 DEBUG nova.objects.instance [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid f56e3bc0-cc0a-49cb-8197-b90fda658edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.073 182729 DEBUG nova.compute.manager [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.073 182729 DEBUG nova.compute.manager [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing instance network info cache due to event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.074 182729 DEBUG oslo_concurrency.lockutils [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.074 182729 DEBUG oslo_concurrency.lockutils [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.074 182729 DEBUG nova.network.neutron [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.097 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <uuid>f56e3bc0-cc0a-49cb-8197-b90fda658edd</uuid>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <name>instance-000000af</name>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896</nova:name>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:53:31</nova:creationTime>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         <nova:port uuid="d75dcb2b-f35e-47f7-baba-92d2feff1bb7">
Jan 22 22:53:31 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <system>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="serial">f56e3bc0-cc0a-49cb-8197-b90fda658edd</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="uuid">f56e3bc0-cc0a-49cb-8197-b90fda658edd</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </system>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <os>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </os>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <features>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </features>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.config"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:de:b1:eb"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <target dev="tapd75dcb2b-f3"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/console.log" append="off"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <video>
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </video>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:53:31 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:53:31 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:53:31 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:53:31 compute-0 nova_compute[182725]: </domain>
Jan 22 22:53:31 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.098 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Preparing to wait for external event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.099 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.099 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.100 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.101 182729 DEBUG nova.virt.libvirt.vif [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:53:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=175,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqeydfN17hZWjpgNkvtTTCheeYHHKdwSPpmwXNdhEz86TsUWZDN0tyralaf4RnRYw6P5/i8Jibda7fY8XZFO9/IMC66WzzHOcjPi7OrUjOoH49qfsv0+e2l/HBahE0rGA==',key_name='tempest-TestSecurityGroupsBasicOps-1137419110',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-ufu701mn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:53:23Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=f56e3bc0-cc0a-49cb-8197-b90fda658edd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.102 182729 DEBUG nova.network.os_vif_util [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.103 182729 DEBUG nova.network.os_vif_util [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.103 182729 DEBUG os_vif [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.104 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.105 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.106 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.110 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.111 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd75dcb2b-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.111 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd75dcb2b-f3, col_values=(('external_ids', {'iface-id': 'd75dcb2b-f35e-47f7-baba-92d2feff1bb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:b1:eb', 'vm-uuid': 'f56e3bc0-cc0a-49cb-8197-b90fda658edd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:31 compute-0 NetworkManager[54954]: <info>  [1769122411.1146] manager: (tapd75dcb2b-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.117 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.120 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.121 182729 INFO os_vif [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3')
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.184 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.186 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.186 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:de:b1:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.186 182729 INFO nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Using config drive
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.717 182729 INFO nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Creating config drive at /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.config
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.727 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvs7h40en execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.877 182729 DEBUG oslo_concurrency.processutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvs7h40en" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:31 compute-0 kernel: tapd75dcb2b-f3: entered promiscuous mode
Jan 22 22:53:31 compute-0 NetworkManager[54954]: <info>  [1769122411.9668] manager: (tapd75dcb2b-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Jan 22 22:53:31 compute-0 ovn_controller[94850]: 2026-01-22T22:53:31Z|00714|binding|INFO|Claiming lport d75dcb2b-f35e-47f7-baba-92d2feff1bb7 for this chassis.
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.967 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 ovn_controller[94850]: 2026-01-22T22:53:31Z|00715|binding|INFO|d75dcb2b-f35e-47f7-baba-92d2feff1bb7: Claiming fa:16:3e:de:b1:eb 10.100.0.11
Jan 22 22:53:31 compute-0 nova_compute[182725]: 2026-01-22 22:53:31.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:31.994 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:b1:eb 10.100.0.11'], port_security=['fa:16:3e:de:b1:eb 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f56e3bc0-cc0a-49cb-8197-b90fda658edd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2acf9528-c898-4fe9-925f-c76f04b00146 4fe7d98a-808f-4c92-aae1-0c89ec1859b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42652772-743d-4fa8-9d93-5074eeced63b, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=d75dcb2b-f35e-47f7-baba-92d2feff1bb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:53:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:31.995 104215 INFO neutron.agent.ovn.metadata.agent [-] Port d75dcb2b-f35e-47f7-baba-92d2feff1bb7 in datapath f56d94fc-8c7c-4266-8e36-c9ab040ac122 bound to our chassis
Jan 22 22:53:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:31.996 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f56d94fc-8c7c-4266-8e36-c9ab040ac122
Jan 22 22:53:31 compute-0 systemd-udevd[237924]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:53:32 compute-0 NetworkManager[54954]: <info>  [1769122412.0110] device (tapd75dcb2b-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:53:32 compute-0 NetworkManager[54954]: <info>  [1769122412.0116] device (tapd75dcb2b-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.013 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0303e96d-ddb0-46ba-b3cc-e2f95ba6e439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.014 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf56d94fc-81 in ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.016 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf56d94fc-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.016 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a60de84d-945a-46a7-8c03-141f7e5ccbc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.017 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e884b96-a733-4df5-bd8d-6896ca2cf74a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 systemd-machined[154006]: New machine qemu-75-instance-000000af.
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.037 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[c0686402-282b-4d64-b301-fb5612b83ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.054 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 ovn_controller[94850]: 2026-01-22T22:53:32Z|00716|binding|INFO|Setting lport d75dcb2b-f35e-47f7-baba-92d2feff1bb7 ovn-installed in OVS
Jan 22 22:53:32 compute-0 ovn_controller[94850]: 2026-01-22T22:53:32Z|00717|binding|INFO|Setting lport d75dcb2b-f35e-47f7-baba-92d2feff1bb7 up in Southbound
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.059 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-000000af.
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.065 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e8772c24-cbb5-4d6f-8c1d-56ce68dfc708]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.097 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[49e8b88c-14f2-49b6-8d04-9e0b90e48c32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 NetworkManager[54954]: <info>  [1769122412.1045] manager: (tapf56d94fc-80): new Veth device (/org/freedesktop/NetworkManager/Devices/334)
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.103 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8eb287-270b-46fa-852f-ebf6fa780136]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.142 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f7110c3d-1a95-4f1b-8e89-78604e0e7dfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.145 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[aab513f2-4ee0-4652-9f37-7711717ecc1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 NetworkManager[54954]: <info>  [1769122412.1659] device (tapf56d94fc-80): carrier: link connected
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.171 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f846ba42-dae5-4531-9673-ab60bf462523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.187 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8874569c-51f3-4b8c-91c4-660b079619a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf56d94fc-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:44:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604976, 'reachable_time': 32668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237960, 'error': None, 'target': 'ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.201 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[43ee30e7-d35f-4eda-bf00-28e81570c602]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:448c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 604976, 'tstamp': 604976}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237961, 'error': None, 'target': 'ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.219 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3501fa04-97b7-4b79-8e99-bf8c8cab661b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf56d94fc-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:44:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604976, 'reachable_time': 32668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237962, 'error': None, 'target': 'ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.252 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f6aeef12-41e0-43fa-b4fa-edab889ca44d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.325 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8f019c35-7f4a-4c31-a094-6bf3ea8cf77f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.327 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf56d94fc-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.327 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.328 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf56d94fc-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 NetworkManager[54954]: <info>  [1769122412.3303] manager: (tapf56d94fc-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 22 22:53:32 compute-0 kernel: tapf56d94fc-80: entered promiscuous mode
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.333 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.333 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf56d94fc-80, col_values=(('external_ids', {'iface-id': '361e5b22-cca0-47f7-85e2-0c4191bd1975'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:32 compute-0 ovn_controller[94850]: 2026-01-22T22:53:32Z|00718|binding|INFO|Releasing lport 361e5b22-cca0-47f7-85e2-0c4191bd1975 from this chassis (sb_readonly=0)
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.337 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.362 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f56d94fc-8c7c-4266-8e36-c9ab040ac122.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f56d94fc-8c7c-4266-8e36-c9ab040ac122.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.363 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8de05b-1977-4283-8227-95b02a8dfe76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.364 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-f56d94fc-8c7c-4266-8e36-c9ab040ac122
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/f56d94fc-8c7c-4266-8e36-c9ab040ac122.pid.haproxy
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID f56d94fc-8c7c-4266-8e36-c9ab040ac122
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:53:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:32.365 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'env', 'PROCESS_TAG=haproxy-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f56d94fc-8c7c-4266-8e36-c9ab040ac122.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.474 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122412.4735446, f56e3bc0-cc0a-49cb-8197-b90fda658edd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.474 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] VM Started (Lifecycle Event)
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.502 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.507 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122412.4736397, f56e3bc0-cc0a-49cb-8197-b90fda658edd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.507 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] VM Paused (Lifecycle Event)
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.529 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.532 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:53:32 compute-0 nova_compute[182725]: 2026-01-22 22:53:32.549 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:53:32 compute-0 podman[238001]: 2026-01-22 22:53:32.730319928 +0000 UTC m=+0.054330192 container create e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 22:53:32 compute-0 systemd[1]: Started libpod-conmon-e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b.scope.
Jan 22 22:53:32 compute-0 podman[238001]: 2026-01-22 22:53:32.697204415 +0000 UTC m=+0.021214679 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:53:32 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:53:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dc203a6d77c823c952be3b84dc494696c1673bc197700aadef0e1c690f80c85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:53:32 compute-0 podman[238001]: 2026-01-22 22:53:32.816191792 +0000 UTC m=+0.140202076 container init e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:53:32 compute-0 podman[238001]: 2026-01-22 22:53:32.821039572 +0000 UTC m=+0.145049816 container start e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:53:32 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [NOTICE]   (238020) : New worker (238022) forked
Jan 22 22:53:32 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [NOTICE]   (238020) : Loading success.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.166 182729 DEBUG nova.network.neutron [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updated VIF entry in instance network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.168 182729 DEBUG nova.network.neutron [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.193 182729 DEBUG oslo_concurrency.lockutils [req-97b89aed-8171-4ef4-bc82-1caf57ab18bd req-6cef2b12-f862-4743-8b1a-e66d9ac5c874 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.261 182729 DEBUG nova.compute.manager [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.262 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.262 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.263 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.263 182729 DEBUG nova.compute.manager [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Processing event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.263 182729 DEBUG nova.compute.manager [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.264 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.264 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.264 182729 DEBUG oslo_concurrency.lockutils [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.265 182729 DEBUG nova.compute.manager [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] No waiting events found dispatching network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.265 182729 WARNING nova.compute.manager [req-ea9e69bf-0110-45e1-bf67-3a23d392c3d8 req-eff8c098-1019-434c-a02c-d183f5068210 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received unexpected event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 for instance with vm_state building and task_state spawning.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.266 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.271 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122413.2716506, f56e3bc0-cc0a-49cb-8197-b90fda658edd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.272 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] VM Resumed (Lifecycle Event)
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.275 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.278 182729 INFO nova.virt.libvirt.driver [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Instance spawned successfully.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.278 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.298 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.307 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.310 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.310 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.311 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.311 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.312 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.312 182729 DEBUG nova.virt.libvirt.driver [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.340 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.434 182729 INFO nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Took 9.64 seconds to spawn the instance on the hypervisor.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.434 182729 DEBUG nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.544 182729 INFO nova.compute.manager [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Took 10.54 seconds to build instance.
Jan 22 22:53:33 compute-0 nova_compute[182725]: 2026-01-22 22:53:33.566 182729 DEBUG oslo_concurrency.lockutils [None req-87d3b352-2693-4c3f-ab7c-9e13f5f3b555 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:34 compute-0 nova_compute[182725]: 2026-01-22 22:53:34.915 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:36 compute-0 nova_compute[182725]: 2026-01-22 22:53:36.115 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:36 compute-0 podman[238032]: 2026-01-22 22:53:36.15676036 +0000 UTC m=+0.068696129 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 22:53:36 compute-0 podman[238031]: 2026-01-22 22:53:36.190388105 +0000 UTC m=+0.112483156 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:53:36 compute-0 podman[238033]: 2026-01-22 22:53:36.190497848 +0000 UTC m=+0.108556789 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:53:39 compute-0 nova_compute[182725]: 2026-01-22 22:53:39.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:39 compute-0 nova_compute[182725]: 2026-01-22 22:53:39.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:53:39 compute-0 nova_compute[182725]: 2026-01-22 22:53:39.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:53:39 compute-0 nova_compute[182725]: 2026-01-22 22:53:39.916 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:40 compute-0 nova_compute[182725]: 2026-01-22 22:53:40.254 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:53:40 compute-0 nova_compute[182725]: 2026-01-22 22:53:40.255 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:53:40 compute-0 nova_compute[182725]: 2026-01-22 22:53:40.255 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:53:40 compute-0 nova_compute[182725]: 2026-01-22 22:53:40.256 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f56e3bc0-cc0a-49cb-8197-b90fda658edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:53:40 compute-0 nova_compute[182725]: 2026-01-22 22:53:40.986 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:40 compute-0 NetworkManager[54954]: <info>  [1769122420.9876] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 22 22:53:40 compute-0 NetworkManager[54954]: <info>  [1769122420.9883] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.061 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:41 compute-0 ovn_controller[94850]: 2026-01-22T22:53:41Z|00719|binding|INFO|Releasing lport 361e5b22-cca0-47f7-85e2-0c4191bd1975 from this chassis (sb_readonly=0)
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.071 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.171 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.491 182729 DEBUG nova.compute.manager [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.492 182729 DEBUG nova.compute.manager [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing instance network info cache due to event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:53:41 compute-0 nova_compute[182725]: 2026-01-22 22:53:41.493 182729 DEBUG oslo_concurrency.lockutils [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.376 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.402 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.402 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.403 182729 DEBUG oslo_concurrency.lockutils [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.403 182729 DEBUG nova.network.neutron [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.404 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.405 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.405 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.405 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.424 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.425 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.425 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.425 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.553 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.614 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.615 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.668 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.815 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.816 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5502MB free_disk=73.28865814208984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.816 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.817 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.911 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance f56e3bc0-cc0a-49cb-8197-b90fda658edd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.911 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.911 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.919 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.973 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:53:44 compute-0 nova_compute[182725]: 2026-01-22 22:53:44.991 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:53:45 compute-0 nova_compute[182725]: 2026-01-22 22:53:45.014 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:53:45 compute-0 nova_compute[182725]: 2026-01-22 22:53:45.015 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:45 compute-0 ovn_controller[94850]: 2026-01-22T22:53:45Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:b1:eb 10.100.0.11
Jan 22 22:53:45 compute-0 ovn_controller[94850]: 2026-01-22T22:53:45Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:b1:eb 10.100.0.11
Jan 22 22:53:46 compute-0 nova_compute[182725]: 2026-01-22 22:53:46.175 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:46 compute-0 nova_compute[182725]: 2026-01-22 22:53:46.499 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:46 compute-0 nova_compute[182725]: 2026-01-22 22:53:46.499 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:47 compute-0 nova_compute[182725]: 2026-01-22 22:53:47.430 182729 DEBUG nova.network.neutron [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updated VIF entry in instance network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:53:47 compute-0 nova_compute[182725]: 2026-01-22 22:53:47.432 182729 DEBUG nova.network.neutron [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:47 compute-0 nova_compute[182725]: 2026-01-22 22:53:47.455 182729 DEBUG oslo_concurrency.lockutils [req-7714ac07-8ab1-4755-80a4-f6a1ee4c7bf4 req-9695f6ea-d242-4527-94ea-bcadf7787606 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:53:47 compute-0 nova_compute[182725]: 2026-01-22 22:53:47.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:53:47 compute-0 nova_compute[182725]: 2026-01-22 22:53:47.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:53:49 compute-0 podman[238118]: 2026-01-22 22:53:49.14125565 +0000 UTC m=+0.069309833 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 22:53:49 compute-0 nova_compute[182725]: 2026-01-22 22:53:49.920 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:51 compute-0 nova_compute[182725]: 2026-01-22 22:53:51.179 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:51 compute-0 ovn_controller[94850]: 2026-01-22T22:53:51Z|00720|binding|INFO|Releasing lport 361e5b22-cca0-47f7-85e2-0c4191bd1975 from this chassis (sb_readonly=0)
Jan 22 22:53:54 compute-0 nova_compute[182725]: 2026-01-22 22:53:54.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.183 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.334 182729 DEBUG nova.compute.manager [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.335 182729 DEBUG nova.compute.manager [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing instance network info cache due to event network-changed-d75dcb2b-f35e-47f7-baba-92d2feff1bb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.335 182729 DEBUG oslo_concurrency.lockutils [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.335 182729 DEBUG oslo_concurrency.lockutils [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.335 182729 DEBUG nova.network.neutron [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Refreshing network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.424 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.424 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.424 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.424 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.425 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.435 182729 INFO nova.compute.manager [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Terminating instance
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.445 182729 DEBUG nova.compute.manager [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:53:56 compute-0 kernel: tapd75dcb2b-f3 (unregistering): left promiscuous mode
Jan 22 22:53:56 compute-0 NetworkManager[54954]: <info>  [1769122436.4697] device (tapd75dcb2b-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:53:56 compute-0 ovn_controller[94850]: 2026-01-22T22:53:56Z|00721|binding|INFO|Releasing lport d75dcb2b-f35e-47f7-baba-92d2feff1bb7 from this chassis (sb_readonly=0)
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.476 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 ovn_controller[94850]: 2026-01-22T22:53:56Z|00722|binding|INFO|Setting lport d75dcb2b-f35e-47f7-baba-92d2feff1bb7 down in Southbound
Jan 22 22:53:56 compute-0 ovn_controller[94850]: 2026-01-22T22:53:56Z|00723|binding|INFO|Removing iface tapd75dcb2b-f3 ovn-installed in OVS
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.479 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.489 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:b1:eb 10.100.0.11'], port_security=['fa:16:3e:de:b1:eb 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f56e3bc0-cc0a-49cb-8197-b90fda658edd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2acf9528-c898-4fe9-925f-c76f04b00146 4fe7d98a-808f-4c92-aae1-0c89ec1859b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42652772-743d-4fa8-9d93-5074eeced63b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=d75dcb2b-f35e-47f7-baba-92d2feff1bb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.491 104215 INFO neutron.agent.ovn.metadata.agent [-] Port d75dcb2b-f35e-47f7-baba-92d2feff1bb7 in datapath f56d94fc-8c7c-4266-8e36-c9ab040ac122 unbound from our chassis
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.492 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f56d94fc-8c7c-4266-8e36-c9ab040ac122, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.494 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.494 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d52606b6-9040-4aae-ba79-9f3dfc5090d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.494 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122 namespace which is not needed anymore
Jan 22 22:53:56 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 22 22:53:56 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000af.scope: Consumed 12.480s CPU time.
Jan 22 22:53:56 compute-0 systemd-machined[154006]: Machine qemu-75-instance-000000af terminated.
Jan 22 22:53:56 compute-0 podman[238139]: 2026-01-22 22:53:56.600773251 +0000 UTC m=+0.107184015 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 22:53:56 compute-0 podman[238143]: 2026-01-22 22:53:56.608590176 +0000 UTC m=+0.109841411 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 22:53:56 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [NOTICE]   (238020) : haproxy version is 2.8.14-c23fe91
Jan 22 22:53:56 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [NOTICE]   (238020) : path to executable is /usr/sbin/haproxy
Jan 22 22:53:56 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [WARNING]  (238020) : Exiting Master process...
Jan 22 22:53:56 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [ALERT]    (238020) : Current worker (238022) exited with code 143 (Terminated)
Jan 22 22:53:56 compute-0 neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122[238016]: [WARNING]  (238020) : All workers exited. Exiting... (0)
Jan 22 22:53:56 compute-0 systemd[1]: libpod-e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b.scope: Deactivated successfully.
Jan 22 22:53:56 compute-0 podman[238205]: 2026-01-22 22:53:56.659717297 +0000 UTC m=+0.041349639 container died e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 22:53:56 compute-0 NetworkManager[54954]: <info>  [1769122436.6654] manager: (tapd75dcb2b-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.668 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.672 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b-userdata-shm.mount: Deactivated successfully.
Jan 22 22:53:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-0dc203a6d77c823c952be3b84dc494696c1673bc197700aadef0e1c690f80c85-merged.mount: Deactivated successfully.
Jan 22 22:53:56 compute-0 podman[238205]: 2026-01-22 22:53:56.70051049 +0000 UTC m=+0.082142822 container cleanup e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:53:56 compute-0 systemd[1]: libpod-conmon-e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b.scope: Deactivated successfully.
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.714 182729 INFO nova.virt.libvirt.driver [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Instance destroyed successfully.
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.714 182729 DEBUG nova.objects.instance [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid f56e3bc0-cc0a-49cb-8197-b90fda658edd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.738 182729 DEBUG nova.virt.libvirt.vif [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:53:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-231871896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=175,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBqeydfN17hZWjpgNkvtTTCheeYHHKdwSPpmwXNdhEz86TsUWZDN0tyralaf4RnRYw6P5/i8Jibda7fY8XZFO9/IMC66WzzHOcjPi7OrUjOoH49qfsv0+e2l/HBahE0rGA==',key_name='tempest-TestSecurityGroupsBasicOps-1137419110',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:53:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-ufu701mn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:53:33Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=f56e3bc0-cc0a-49cb-8197-b90fda658edd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.738 182729 DEBUG nova.network.os_vif_util [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.739 182729 DEBUG nova.network.os_vif_util [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.739 182729 DEBUG os_vif [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.740 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.741 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd75dcb2b-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.742 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.746 182729 INFO os_vif [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:b1:eb,bridge_name='br-int',has_traffic_filtering=True,id=d75dcb2b-f35e-47f7-baba-92d2feff1bb7,network=Network(f56d94fc-8c7c-4266-8e36-c9ab040ac122),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd75dcb2b-f3')
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.746 182729 INFO nova.virt.libvirt.driver [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Deleting instance files /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd_del
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.747 182729 INFO nova.virt.libvirt.driver [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Deletion of /var/lib/nova/instances/f56e3bc0-cc0a-49cb-8197-b90fda658edd_del complete
Jan 22 22:53:56 compute-0 podman[238248]: 2026-01-22 22:53:56.760702607 +0000 UTC m=+0.038496388 container remove e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.765 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e7b6f90d-1924-43e7-a88e-28c39b88bb3e]: (4, ('Thu Jan 22 10:53:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122 (e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b)\ne42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b\nThu Jan 22 10:53:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122 (e42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b)\ne42281289c5c8b1649b9775f0e1b7aca156207193d86db743b0d04ed9af7d19b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.766 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[32cc0694-e6d8-41a7-b72d-4f487e58d1f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.767 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf56d94fc-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.769 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 kernel: tapf56d94fc-80: left promiscuous mode
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.780 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.783 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[388e0e31-935e-47db-a6a4-7ce31882b474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.799 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[bed05545-c45f-4cab-b4fe-b8adbd7b48f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.800 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9203f0-4733-49ae-b5b1-c04f093cb5b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.815 182729 DEBUG nova.compute.manager [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-unplugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.815 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad652bc-5f99-4300-9ba1-6a3b3fcca995]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604969, 'reachable_time': 34244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238263, 'error': None, 'target': 'ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.815 182729 DEBUG oslo_concurrency.lockutils [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.816 182729 DEBUG oslo_concurrency.lockutils [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.816 182729 DEBUG oslo_concurrency.lockutils [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.816 182729 DEBUG nova.compute.manager [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] No waiting events found dispatching network-vif-unplugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.816 182729 DEBUG nova.compute.manager [req-1abe3040-0a7c-46df-8449-04c86572e2bc req-6ba37ad3-5500-44e3-97df-44ac265e2d09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-unplugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:53:56 compute-0 systemd[1]: run-netns-ovnmeta\x2df56d94fc\x2d8c7c\x2d4266\x2d8e36\x2dc9ab040ac122.mount: Deactivated successfully.
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.819 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f56d94fc-8c7c-4266-8e36-c9ab040ac122 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:53:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:53:56.819 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2f8fd2-750e-4904-b19f-ad00e8b113c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.848 182729 INFO nova.compute.manager [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.849 182729 DEBUG oslo.service.loopingcall [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.850 182729 DEBUG nova.compute.manager [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:53:56 compute-0 nova_compute[182725]: 2026-01-22 22:53:56.850 182729 DEBUG nova.network.neutron [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.644 182729 DEBUG nova.network.neutron [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.669 182729 INFO nova.compute.manager [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Took 1.82 seconds to deallocate network for instance.
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.762 182729 DEBUG nova.compute.manager [req-f09bf9d9-6a7e-4140-bfb5-f2fa72c65de4 req-41314229-6965-45a1-9233-c42b22038305 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-deleted-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.822 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.822 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.892 182729 DEBUG nova.compute.provider_tree [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.901 182729 DEBUG nova.network.neutron [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updated VIF entry in instance network info cache for port d75dcb2b-f35e-47f7-baba-92d2feff1bb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.901 182729 DEBUG nova.network.neutron [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Updating instance_info_cache with network_info: [{"id": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "address": "fa:16:3e:de:b1:eb", "network": {"id": "f56d94fc-8c7c-4266-8e36-c9ab040ac122", "bridge": "br-int", "label": "tempest-network-smoke--1761889706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd75dcb2b-f3", "ovs_interfaceid": "d75dcb2b-f35e-47f7-baba-92d2feff1bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.913 182729 DEBUG nova.scheduler.client.report [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.933 182729 DEBUG oslo_concurrency.lockutils [req-045e05bf-c57e-4898-9e4c-b4c576ff167f req-9f2bb3e1-ff3d-455c-afc2-61b35c875794 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f56e3bc0-cc0a-49cb-8197-b90fda658edd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.942 182729 DEBUG nova.compute.manager [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.943 182729 DEBUG oslo_concurrency.lockutils [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.943 182729 DEBUG oslo_concurrency.lockutils [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.943 182729 DEBUG oslo_concurrency.lockutils [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.943 182729 DEBUG nova.compute.manager [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] No waiting events found dispatching network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.944 182729 WARNING nova.compute.manager [req-1c45183c-8f8b-4903-b5dd-fd72336d65ec req-40aedb16-f942-42d8-96d4-f68f88de0c7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Received unexpected event network-vif-plugged-d75dcb2b-f35e-47f7-baba-92d2feff1bb7 for instance with vm_state deleted and task_state None.
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.945 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:58 compute-0 nova_compute[182725]: 2026-01-22 22:53:58.999 182729 INFO nova.scheduler.client.report [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance f56e3bc0-cc0a-49cb-8197-b90fda658edd
Jan 22 22:53:59 compute-0 nova_compute[182725]: 2026-01-22 22:53:59.101 182729 DEBUG oslo_concurrency.lockutils [None req-e8a7979e-b8f5-4984-a90b-64448bb35e0c 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "f56e3bc0-cc0a-49cb-8197-b90fda658edd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:53:59 compute-0 nova_compute[182725]: 2026-01-22 22:53:59.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:00 compute-0 nova_compute[182725]: 2026-01-22 22:54:00.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:01 compute-0 nova_compute[182725]: 2026-01-22 22:54:01.744 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:04 compute-0 nova_compute[182725]: 2026-01-22 22:54:04.996 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:06 compute-0 nova_compute[182725]: 2026-01-22 22:54:06.273 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:06 compute-0 nova_compute[182725]: 2026-01-22 22:54:06.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:06 compute-0 nova_compute[182725]: 2026-01-22 22:54:06.746 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:07 compute-0 podman[238267]: 2026-01-22 22:54:07.167560193 +0000 UTC m=+0.091941446 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 22 22:54:07 compute-0 podman[238266]: 2026-01-22 22:54:07.171977613 +0000 UTC m=+0.095596337 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:54:07 compute-0 podman[238268]: 2026-01-22 22:54:07.191128379 +0000 UTC m=+0.108074817 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:54:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:54:09 compute-0 nova_compute[182725]: 2026-01-22 22:54:09.998 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:11 compute-0 nova_compute[182725]: 2026-01-22 22:54:11.713 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122436.7122922, f56e3bc0-cc0a-49cb-8197-b90fda658edd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:54:11 compute-0 nova_compute[182725]: 2026-01-22 22:54:11.714 182729 INFO nova.compute.manager [-] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] VM Stopped (Lifecycle Event)
Jan 22 22:54:11 compute-0 nova_compute[182725]: 2026-01-22 22:54:11.733 182729 DEBUG nova.compute.manager [None req-b776df4e-47db-4037-96ab-b2f8b9961e30 - - - - - -] [instance: f56e3bc0-cc0a-49cb-8197-b90fda658edd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:54:11 compute-0 nova_compute[182725]: 2026-01-22 22:54:11.750 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:12.464 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:12.466 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:15 compute-0 nova_compute[182725]: 2026-01-22 22:54:15.001 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:15 compute-0 systemd[1]: Starting dnf makecache...
Jan 22 22:54:15 compute-0 dnf[238331]: Metadata cache refreshed recently.
Jan 22 22:54:15 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 22:54:15 compute-0 systemd[1]: Finished dnf makecache.
Jan 22 22:54:16 compute-0 nova_compute[182725]: 2026-01-22 22:54:16.752 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:20 compute-0 nova_compute[182725]: 2026-01-22 22:54:20.036 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:20 compute-0 podman[238332]: 2026-01-22 22:54:20.136722794 +0000 UTC m=+0.071281742 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 22 22:54:21 compute-0 nova_compute[182725]: 2026-01-22 22:54:21.755 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:21 compute-0 nova_compute[182725]: 2026-01-22 22:54:21.920 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:21 compute-0 nova_compute[182725]: 2026-01-22 22:54:21.921 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:21 compute-0 nova_compute[182725]: 2026-01-22 22:54:21.938 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.058 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.059 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.069 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.069 182729 INFO nova.compute.claims [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.241 182729 DEBUG nova.compute.provider_tree [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.267 182729 DEBUG nova.scheduler.client.report [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.312 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.313 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.379 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.380 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.416 182729 INFO nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.450 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.580 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.581 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.582 182729 INFO nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Creating image(s)
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.582 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.582 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.583 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.595 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.649 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.650 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.651 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.673 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.727 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.729 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.775 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.776 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.776 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.829 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.830 182729 DEBUG nova.virt.disk.api [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.830 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.886 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.887 182729 DEBUG nova.virt.disk.api [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.887 182729 DEBUG nova.objects.instance [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid 2d5faa8e-c258-40cc-85a4-fa95e31593c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.911 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.911 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Ensure instance console log exists: /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.911 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.912 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:22 compute-0 nova_compute[182725]: 2026-01-22 22:54:22.912 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:23 compute-0 nova_compute[182725]: 2026-01-22 22:54:23.276 182729 DEBUG nova.policy [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:54:25 compute-0 nova_compute[182725]: 2026-01-22 22:54:25.038 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:26 compute-0 nova_compute[182725]: 2026-01-22 22:54:26.109 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Successfully created port: 964c5203-3352-4645-a500-ff3ce1e8d508 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:54:26 compute-0 nova_compute[182725]: 2026-01-22 22:54:26.759 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:27 compute-0 podman[238367]: 2026-01-22 22:54:27.154732933 +0000 UTC m=+0.086858480 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:54:27 compute-0 podman[238368]: 2026-01-22 22:54:27.154900737 +0000 UTC m=+0.086744047 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc.)
Jan 22 22:54:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:27.643 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8::f816:3eff:fecb:140a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0ee482bf-08e3-4c08-b19d-3799def66c4e) old=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:54:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:27.645 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0ee482bf-08e3-4c08-b19d-3799def66c4e in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 updated
Jan 22 22:54:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:27.646 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dcb11b3-4f88-477e-8e29-469839246ce6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:54:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:27.647 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[683f64b6-8a67-4d76-bfc6-af9c74662019]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.008 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Successfully updated port: 964c5203-3352-4645-a500-ff3ce1e8d508 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.031 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.032 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.032 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.146 182729 DEBUG nova.compute.manager [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.147 182729 DEBUG nova.compute.manager [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing instance network info cache due to event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.147 182729 DEBUG oslo_concurrency.lockutils [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:54:28 compute-0 nova_compute[182725]: 2026-01-22 22:54:28.233 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.386 182729 DEBUG nova.network.neutron [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.411 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.412 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance network_info: |[{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.413 182729 DEBUG oslo_concurrency.lockutils [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.413 182729 DEBUG nova.network.neutron [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.419 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Start _get_guest_xml network_info=[{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.424 182729 WARNING nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.433 182729 DEBUG nova.virt.libvirt.host [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.434 182729 DEBUG nova.virt.libvirt.host [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.443 182729 DEBUG nova.virt.libvirt.host [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.444 182729 DEBUG nova.virt.libvirt.host [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.446 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.446 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.447 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.447 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.448 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.448 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.449 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.449 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.450 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.450 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.451 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.451 182729 DEBUG nova.virt.hardware [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.457 182729 DEBUG nova.virt.libvirt.vif [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:54:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=176,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-apphhjiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:54:22Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=2d5faa8e-c258-40cc-85a4-fa95e31593c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.458 182729 DEBUG nova.network.os_vif_util [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.459 182729 DEBUG nova.network.os_vif_util [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.460 182729 DEBUG nova.objects.instance [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d5faa8e-c258-40cc-85a4-fa95e31593c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.478 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <uuid>2d5faa8e-c258-40cc-85a4-fa95e31593c2</uuid>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <name>instance-000000b0</name>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521</nova:name>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:54:29</nova:creationTime>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         <nova:port uuid="964c5203-3352-4645-a500-ff3ce1e8d508">
Jan 22 22:54:29 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <system>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="serial">2d5faa8e-c258-40cc-85a4-fa95e31593c2</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="uuid">2d5faa8e-c258-40cc-85a4-fa95e31593c2</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </system>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <os>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </os>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <features>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </features>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.config"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:98:30:f4"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <target dev="tap964c5203-33"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/console.log" append="off"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <video>
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </video>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:54:29 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:54:29 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:54:29 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:54:29 compute-0 nova_compute[182725]: </domain>
Jan 22 22:54:29 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.480 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Preparing to wait for external event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.481 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.481 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.481 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.481 182729 DEBUG nova.virt.libvirt.vif [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:54:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=176,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-apphhjiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:54:22Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=2d5faa8e-c258-40cc-85a4-fa95e31593c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.482 182729 DEBUG nova.network.os_vif_util [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.482 182729 DEBUG nova.network.os_vif_util [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.482 182729 DEBUG os_vif [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.483 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.483 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.483 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.486 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap964c5203-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.487 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap964c5203-33, col_values=(('external_ids', {'iface-id': '964c5203-3352-4645-a500-ff3ce1e8d508', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:30:f4', 'vm-uuid': '2d5faa8e-c258-40cc-85a4-fa95e31593c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.488 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:29 compute-0 NetworkManager[54954]: <info>  [1769122469.4899] manager: (tap964c5203-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.490 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.497 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.497 182729 INFO os_vif [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33')
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.581 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.581 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.581 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:98:30:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:54:29 compute-0 nova_compute[182725]: 2026-01-22 22:54:29.582 182729 INFO nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Using config drive
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.040 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.412 182729 INFO nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Creating config drive at /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.config
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.418 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprvhjel1f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.552 182729 DEBUG oslo_concurrency.processutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprvhjel1f" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:30 compute-0 kernel: tap964c5203-33: entered promiscuous mode
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.6425] manager: (tap964c5203-33): new Tun device (/org/freedesktop/NetworkManager/Devices/340)
Jan 22 22:54:30 compute-0 ovn_controller[94850]: 2026-01-22T22:54:30Z|00724|binding|INFO|Claiming lport 964c5203-3352-4645-a500-ff3ce1e8d508 for this chassis.
Jan 22 22:54:30 compute-0 ovn_controller[94850]: 2026-01-22T22:54:30Z|00725|binding|INFO|964c5203-3352-4645-a500-ff3ce1e8d508: Claiming fa:16:3e:98:30:f4 10.100.0.12
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.651 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.654 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.666 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:30:f4 10.100.0.12'], port_security=['fa:16:3e:98:30:f4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2d5faa8e-c258-40cc-85a4-fa95e31593c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1ed5133b-95fa-4539-885a-e8aa5db43dd3 77169cf9-4ddb-4a48-a907-fa4abc0d69fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd7cbcb5-c231-44bb-be1c-c0898fbee74d, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=964c5203-3352-4645-a500-ff3ce1e8d508) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.666 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 964c5203-3352-4645-a500-ff3ce1e8d508 in datapath 930b9b12-ffcc-452a-86e1-0321bc77aa71 bound to our chassis
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.667 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 930b9b12-ffcc-452a-86e1-0321bc77aa71
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.679 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[25b0f51b-ce7b-4bcc-92c5-5e8be0ef5a48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.680 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap930b9b12-f1 in ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.681 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap930b9b12-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.681 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[da3b4d51-e2e0-4e21-9558-b88477bf394f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.682 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3617e408-985f-4a52-bba5-18d8a894c69d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 systemd-udevd[238437]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:54:30 compute-0 systemd-machined[154006]: New machine qemu-76-instance-000000b0.
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.698 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[52f08764-6dbf-48dc-99da-09dea05c8e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.7033] device (tap964c5203-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.7038] device (tap964c5203-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.708 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_controller[94850]: 2026-01-22T22:54:30Z|00726|binding|INFO|Setting lport 964c5203-3352-4645-a500-ff3ce1e8d508 ovn-installed in OVS
Jan 22 22:54:30 compute-0 ovn_controller[94850]: 2026-01-22T22:54:30Z|00727|binding|INFO|Setting lport 964c5203-3352-4645-a500-ff3ce1e8d508 up in Southbound
Jan 22 22:54:30 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-000000b0.
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.714 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.722 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1283173c-59d6-4874-8d38-722bdc29ba7f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.749 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[728d869e-38b6-4166-9a89-71885129667f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.753 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d4b1c2-1f0b-42a6-9544-f6ede2ea0300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.7543] manager: (tap930b9b12-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/341)
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.784 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[82256637-14f2-4e5b-955c-7e034619459f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.788 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[69168ff5-7fb2-4fae-948d-c197f3957d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.8118] device (tap930b9b12-f0): carrier: link connected
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.817 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[5d964c47-4b41-45a2-a2e0-458343222dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.836 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9bedb580-55eb-4e9a-a8df-651dc095df13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap930b9b12-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:93:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610841, 'reachable_time': 35766, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238469, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.852 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[007ca843-827e-45b0-821d-b766400841b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:9313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610841, 'tstamp': 610841}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238470, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.867 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d666f41e-6a62-4463-8cc0-a5034ea2c9b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap930b9b12-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:93:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610841, 'reachable_time': 35766, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238471, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.896 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[df877ab1-c798-4e73-aad3-83dce8cd4276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.951 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[26c9232d-7562-4bc9-ab88-0e4b5de36aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.952 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap930b9b12-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.952 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.952 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap930b9b12-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.954 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 NetworkManager[54954]: <info>  [1769122470.9551] manager: (tap930b9b12-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 22 22:54:30 compute-0 kernel: tap930b9b12-f0: entered promiscuous mode
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.958 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.959 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap930b9b12-f0, col_values=(('external_ids', {'iface-id': '5422293b-7f51-474a-850b-2710ef12aac0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.960 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_controller[94850]: 2026-01-22T22:54:30Z|00728|binding|INFO|Releasing lport 5422293b-7f51-474a-850b-2710ef12aac0 from this chassis (sb_readonly=0)
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.962 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.963 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5721412f-3143-4441-9c2c-b7eba213ad5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.963 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-930b9b12-ffcc-452a-86e1-0321bc77aa71
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 930b9b12-ffcc-452a-86e1-0321bc77aa71
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:54:30 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:30.964 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'env', 'PROCESS_TAG=haproxy-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/930b9b12-ffcc-452a-86e1-0321bc77aa71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:54:30 compute-0 nova_compute[182725]: 2026-01-22 22:54:30.972 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.263 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:31.264 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:54:31 compute-0 podman[238503]: 2026-01-22 22:54:31.353453351 +0000 UTC m=+0.073152880 container create ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 22:54:31 compute-0 systemd[1]: Started libpod-conmon-ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7.scope.
Jan 22 22:54:31 compute-0 podman[238503]: 2026-01-22 22:54:31.315438396 +0000 UTC m=+0.035137975 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.414 182729 DEBUG nova.compute.manager [req-ab51aa07-cccd-444d-aa7b-cdc0aa3312eb req-81f206c2-8542-438e-89aa-87e9ddfe34fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.415 182729 DEBUG oslo_concurrency.lockutils [req-ab51aa07-cccd-444d-aa7b-cdc0aa3312eb req-81f206c2-8542-438e-89aa-87e9ddfe34fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.415 182729 DEBUG oslo_concurrency.lockutils [req-ab51aa07-cccd-444d-aa7b-cdc0aa3312eb req-81f206c2-8542-438e-89aa-87e9ddfe34fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.416 182729 DEBUG oslo_concurrency.lockutils [req-ab51aa07-cccd-444d-aa7b-cdc0aa3312eb req-81f206c2-8542-438e-89aa-87e9ddfe34fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.416 182729 DEBUG nova.compute.manager [req-ab51aa07-cccd-444d-aa7b-cdc0aa3312eb req-81f206c2-8542-438e-89aa-87e9ddfe34fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Processing event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:54:31 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:54:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd81dd37ae966979601cf75e6bc5a9d457bbf583957991384209c139b5ccc56c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:54:31 compute-0 podman[238503]: 2026-01-22 22:54:31.449868047 +0000 UTC m=+0.169567596 container init ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:54:31 compute-0 podman[238503]: 2026-01-22 22:54:31.461756862 +0000 UTC m=+0.181456391 container start ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:54:31 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [NOTICE]   (238529) : New worker (238532) forked
Jan 22 22:54:31 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [NOTICE]   (238529) : Loading success.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.537 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122471.5369608, 2d5faa8e-c258-40cc-85a4-fa95e31593c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.538 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] VM Started (Lifecycle Event)
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.542 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:54:31 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:31.543 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.545 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.548 182729 INFO nova.virt.libvirt.driver [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance spawned successfully.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.548 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.559 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.563 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.567 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.567 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.567 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.568 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.568 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.569 182729 DEBUG nova.virt.libvirt.driver [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.618 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.619 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122471.5377374, 2d5faa8e-c258-40cc-85a4-fa95e31593c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.619 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] VM Paused (Lifecycle Event)
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.638 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.641 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122471.5441504, 2d5faa8e-c258-40cc-85a4-fa95e31593c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.642 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] VM Resumed (Lifecycle Event)
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.666 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.669 182729 INFO nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Took 9.09 seconds to spawn the instance on the hypervisor.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.669 182729 DEBUG nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.673 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.700 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.749 182729 INFO nova.compute.manager [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Took 9.75 seconds to build instance.
Jan 22 22:54:31 compute-0 nova_compute[182725]: 2026-01-22 22:54:31.779 182729 DEBUG oslo_concurrency.lockutils [None req-a4dc743f-533a-4239-94d9-9ee44e83378e 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:32 compute-0 nova_compute[182725]: 2026-01-22 22:54:32.523 182729 DEBUG nova.network.neutron [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updated VIF entry in instance network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:54:32 compute-0 nova_compute[182725]: 2026-01-22 22:54:32.525 182729 DEBUG nova.network.neutron [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:54:32 compute-0 nova_compute[182725]: 2026-01-22 22:54:32.542 182729 DEBUG oslo_concurrency.lockutils [req-bd80b38e-0e6e-4307-aa9b-438c334c9e00 req-6d4e1b99-ccbf-4907-b3c9-1e970c8e66e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:54:32 compute-0 nova_compute[182725]: 2026-01-22 22:54:32.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.524 182729 DEBUG nova.compute.manager [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.524 182729 DEBUG oslo_concurrency.lockutils [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.524 182729 DEBUG oslo_concurrency.lockutils [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.525 182729 DEBUG oslo_concurrency.lockutils [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.525 182729 DEBUG nova.compute.manager [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] No waiting events found dispatching network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:54:33 compute-0 nova_compute[182725]: 2026-01-22 22:54:33.525 182729 WARNING nova.compute.manager [req-4002a86d-088f-4803-b585-81e4d6f5379c req-dfc6db2f-d6ae-4d7a-8cf6-ec8b51f96791 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received unexpected event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 for instance with vm_state active and task_state None.
Jan 22 22:54:33 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:33.546 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:54:34 compute-0 nova_compute[182725]: 2026-01-22 22:54:34.491 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:35 compute-0 nova_compute[182725]: 2026-01-22 22:54:35.044 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:35.189 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8:0:1:f816:3eff:fecb:140a 2001:db8::f816:3eff:fecb:140a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fecb:140a/64 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0ee482bf-08e3-4c08-b19d-3799def66c4e) old=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8::f816:3eff:fecb:140a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:54:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:35.191 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0ee482bf-08e3-4c08-b19d-3799def66c4e in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 updated
Jan 22 22:54:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:35.193 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dcb11b3-4f88-477e-8e29-469839246ce6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:54:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:54:35.194 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[67d7149b-3332-440a-be4e-4b5fbc845d97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.466 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:36 compute-0 NetworkManager[54954]: <info>  [1769122476.4696] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 22 22:54:36 compute-0 NetworkManager[54954]: <info>  [1769122476.4724] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.529 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:36 compute-0 ovn_controller[94850]: 2026-01-22T22:54:36Z|00729|binding|INFO|Releasing lport 5422293b-7f51-474a-850b-2710ef12aac0 from this chassis (sb_readonly=0)
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.544 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.957 182729 DEBUG nova.compute.manager [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.957 182729 DEBUG nova.compute.manager [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing instance network info cache due to event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.958 182729 DEBUG oslo_concurrency.lockutils [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.959 182729 DEBUG oslo_concurrency.lockutils [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:54:36 compute-0 nova_compute[182725]: 2026-01-22 22:54:36.959 182729 DEBUG nova.network.neutron [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:54:38 compute-0 podman[238542]: 2026-01-22 22:54:38.12607051 +0000 UTC m=+0.059813497 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:54:38 compute-0 podman[238544]: 2026-01-22 22:54:38.133586517 +0000 UTC m=+0.061314135 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:54:38 compute-0 podman[238543]: 2026-01-22 22:54:38.133931056 +0000 UTC m=+0.064464474 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 22:54:38 compute-0 nova_compute[182725]: 2026-01-22 22:54:38.463 182729 DEBUG nova.network.neutron [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updated VIF entry in instance network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:54:38 compute-0 nova_compute[182725]: 2026-01-22 22:54:38.465 182729 DEBUG nova.network.neutron [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:54:38 compute-0 nova_compute[182725]: 2026-01-22 22:54:38.493 182729 DEBUG oslo_concurrency.lockutils [req-6bfc775f-f74c-46b5-9a9b-9027f554fcdd req-fb1a09e4-d348-41d2-b2ee-84a0e2541e2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:54:39 compute-0 nova_compute[182725]: 2026-01-22 22:54:39.494 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:39 compute-0 nova_compute[182725]: 2026-01-22 22:54:39.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:39 compute-0 nova_compute[182725]: 2026-01-22 22:54:39.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:54:39 compute-0 nova_compute[182725]: 2026-01-22 22:54:39.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:54:40 compute-0 nova_compute[182725]: 2026-01-22 22:54:40.046 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:40 compute-0 nova_compute[182725]: 2026-01-22 22:54:40.160 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:54:40 compute-0 nova_compute[182725]: 2026-01-22 22:54:40.161 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:54:40 compute-0 nova_compute[182725]: 2026-01-22 22:54:40.161 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:54:40 compute-0 nova_compute[182725]: 2026-01-22 22:54:40.162 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2d5faa8e-c258-40cc-85a4-fa95e31593c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.635 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.656 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.657 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.912 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.913 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:54:41 compute-0 nova_compute[182725]: 2026-01-22 22:54:41.972 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.034 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.035 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.087 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.227 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.228 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5496MB free_disk=73.31549072265625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.229 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.229 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.446 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 2d5faa8e-c258-40cc-85a4-fa95e31593c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.446 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.446 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.605 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.620 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.640 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:54:42 compute-0 nova_compute[182725]: 2026-01-22 22:54:42.640 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:54:43 compute-0 ovn_controller[94850]: 2026-01-22T22:54:43Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:30:f4 10.100.0.12
Jan 22 22:54:43 compute-0 ovn_controller[94850]: 2026-01-22T22:54:43Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:30:f4 10.100.0.12
Jan 22 22:54:44 compute-0 nova_compute[182725]: 2026-01-22 22:54:44.495 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:44 compute-0 nova_compute[182725]: 2026-01-22 22:54:44.641 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:45 compute-0 nova_compute[182725]: 2026-01-22 22:54:45.048 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:45 compute-0 nova_compute[182725]: 2026-01-22 22:54:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:45 compute-0 nova_compute[182725]: 2026-01-22 22:54:45.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:54:46 compute-0 nova_compute[182725]: 2026-01-22 22:54:46.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:47 compute-0 nova_compute[182725]: 2026-01-22 22:54:47.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:48 compute-0 nova_compute[182725]: 2026-01-22 22:54:48.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:54:48 compute-0 nova_compute[182725]: 2026-01-22 22:54:48.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:54:49 compute-0 nova_compute[182725]: 2026-01-22 22:54:49.497 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:50 compute-0 nova_compute[182725]: 2026-01-22 22:54:50.051 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:51 compute-0 podman[238632]: 2026-01-22 22:54:51.154184787 +0000 UTC m=+0.084929301 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:54:54 compute-0 nova_compute[182725]: 2026-01-22 22:54:54.499 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:55 compute-0 nova_compute[182725]: 2026-01-22 22:54:55.053 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:58 compute-0 podman[238654]: 2026-01-22 22:54:58.121763821 +0000 UTC m=+0.057948211 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, 
build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350)
Jan 22 22:54:58 compute-0 podman[238653]: 2026-01-22 22:54:58.142923017 +0000 UTC m=+0.081783323 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 22:54:59 compute-0 nova_compute[182725]: 2026-01-22 22:54:59.501 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:54:59 compute-0 nova_compute[182725]: 2026-01-22 22:54:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:00 compute-0 nova_compute[182725]: 2026-01-22 22:55:00.055 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:04 compute-0 nova_compute[182725]: 2026-01-22 22:55:04.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:05 compute-0 nova_compute[182725]: 2026-01-22 22:55:05.057 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:09 compute-0 podman[238704]: 2026-01-22 22:55:09.110596062 +0000 UTC m=+0.050386383 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:55:09 compute-0 podman[238706]: 2026-01-22 22:55:09.134568038 +0000 UTC m=+0.067400366 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:55:09 compute-0 podman[238705]: 2026-01-22 22:55:09.144978007 +0000 UTC m=+0.070349560 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:55:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:09.503 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:55:09 compute-0 nova_compute[182725]: 2026-01-22 22:55:09.503 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:09.503 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:55:09 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:09.504 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:10 compute-0 nova_compute[182725]: 2026-01-22 22:55:10.123 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:12.465 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:12.468 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:13 compute-0 nova_compute[182725]: 2026-01-22 22:55:13.903 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:13 compute-0 nova_compute[182725]: 2026-01-22 22:55:13.903 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 22:55:13 compute-0 nova_compute[182725]: 2026-01-22 22:55:13.917 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 22:55:14 compute-0 nova_compute[182725]: 2026-01-22 22:55:14.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:15 compute-0 nova_compute[182725]: 2026-01-22 22:55:15.173 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.574 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.575 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.598 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.734 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.734 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.742 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.742 182729 INFO nova.compute.claims [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.903 182729 DEBUG nova.compute.provider_tree [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.919 182729 DEBUG nova.scheduler.client.report [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.946 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:17 compute-0 nova_compute[182725]: 2026-01-22 22:55:17.947 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.023 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.024 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.039 182729 INFO nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.056 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.177 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.179 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.179 182729 INFO nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Creating image(s)
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.181 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.181 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.182 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.211 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.264 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.266 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.266 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.284 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.336 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.338 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.372 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.373 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.374 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.451 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.452 182729 DEBUG nova.virt.disk.api [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.452 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.504 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.505 182729 DEBUG nova.virt.disk.api [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.505 182729 DEBUG nova.objects.instance [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.517 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.518 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Ensure instance console log exists: /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.518 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.518 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:18 compute-0 nova_compute[182725]: 2026-01-22 22:55:18.519 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:19 compute-0 nova_compute[182725]: 2026-01-22 22:55:19.178 182729 DEBUG nova.policy [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:55:19 compute-0 nova_compute[182725]: 2026-01-22 22:55:19.507 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:20 compute-0 nova_compute[182725]: 2026-01-22 22:55:20.176 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:22 compute-0 podman[238784]: 2026-01-22 22:55:22.123608789 +0000 UTC m=+0.059534590 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:55:22 compute-0 nova_compute[182725]: 2026-01-22 22:55:22.295 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Successfully created port: da8778bf-4a44-45d9-ad3d-1ec45d20b728 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.372 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Successfully updated port: da8778bf-4a44-45d9-ad3d-1ec45d20b728 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.392 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.392 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.393 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.490 182729 DEBUG nova.compute.manager [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.491 182729 DEBUG nova.compute.manager [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing instance network info cache due to event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.491 182729 DEBUG oslo_concurrency.lockutils [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:24 compute-0 nova_compute[182725]: 2026-01-22 22:55:24.509 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:25 compute-0 nova_compute[182725]: 2026-01-22 22:55:25.165 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:55:25 compute-0 nova_compute[182725]: 2026-01-22 22:55:25.178 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.327 182729 DEBUG nova.network.neutron [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.370 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.371 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Instance network_info: |[{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.372 182729 DEBUG oslo_concurrency.lockutils [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.372 182729 DEBUG nova.network.neutron [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.378 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Start _get_guest_xml network_info=[{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.387 182729 WARNING nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.395 182729 DEBUG nova.virt.libvirt.host [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.396 182729 DEBUG nova.virt.libvirt.host [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.401 182729 DEBUG nova.virt.libvirt.host [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.402 182729 DEBUG nova.virt.libvirt.host [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.403 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.404 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.405 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.405 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.406 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.406 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.407 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.407 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.407 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.408 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.408 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.409 182729 DEBUG nova.virt.hardware [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.416 182729 DEBUG nova.virt.libvirt.vif [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523065053',display_name='tempest-TestGettingAddress-server-1523065053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523065053',id=179,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxyF6oDs5NL5SIJdgv11Cb5pkqsRGPmB3cHPKqEJ7v8YrcNdEWhfsxiVfH+ECyaszYZaZ2to32RK3TfjuRjA3+A+nm4Vjw7gDY4fF8gxKEWJ5+K8RKZHavOEN/KGbXqvQ==',key_name='tempest-TestGettingAddress-38314300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-81xquuf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:55:18Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.417 182729 DEBUG nova.network.os_vif_util [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.418 182729 DEBUG nova.network.os_vif_util [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.420 182729 DEBUG nova.objects.instance [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.438 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <uuid>4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d</uuid>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <name>instance-000000b3</name>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:name>tempest-TestGettingAddress-server-1523065053</nova:name>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:55:26</nova:creationTime>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         <nova:port uuid="da8778bf-4a44-45d9-ad3d-1ec45d20b728">
Jan 22 22:55:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe6a:e0df" ipVersion="6"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe6a:e0df" ipVersion="6"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <system>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="serial">4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="uuid">4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </system>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <os>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </os>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <features>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </features>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.config"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:6a:e0:df"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <target dev="tapda8778bf-4a"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/console.log" append="off"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <video>
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </video>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:55:26 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:55:26 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:55:26 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:55:26 compute-0 nova_compute[182725]: </domain>
Jan 22 22:55:26 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.438 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Preparing to wait for external event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.439 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.439 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.441 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.442 182729 DEBUG nova.virt.libvirt.vif [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523065053',display_name='tempest-TestGettingAddress-server-1523065053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523065053',id=179,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxyF6oDs5NL5SIJdgv11Cb5pkqsRGPmB3cHPKqEJ7v8YrcNdEWhfsxiVfH+ECyaszYZaZ2to32RK3TfjuRjA3+A+nm4Vjw7gDY4fF8gxKEWJ5+K8RKZHavOEN/KGbXqvQ==',key_name='tempest-TestGettingAddress-38314300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-81xquuf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:55:18Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.443 182729 DEBUG nova.network.os_vif_util [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.444 182729 DEBUG nova.network.os_vif_util [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.445 182729 DEBUG os_vif [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.447 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.448 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.452 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.452 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda8778bf-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.453 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda8778bf-4a, col_values=(('external_ids', {'iface-id': 'da8778bf-4a44-45d9-ad3d-1ec45d20b728', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:e0:df', 'vm-uuid': '4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.455 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:26 compute-0 NetworkManager[54954]: <info>  [1769122526.4564] manager: (tapda8778bf-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.458 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.463 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.465 182729 INFO os_vif [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a')
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.511 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.512 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.512 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:6a:e0:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:55:26 compute-0 nova_compute[182725]: 2026-01-22 22:55:26.513 182729 INFO nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Using config drive
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.314 182729 INFO nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Creating config drive at /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.config
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.318 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpconczd0x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.442 182729 DEBUG oslo_concurrency.processutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpconczd0x" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:27 compute-0 kernel: tapda8778bf-4a: entered promiscuous mode
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.5102] manager: (tapda8778bf-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/346)
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.512 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 ovn_controller[94850]: 2026-01-22T22:55:27Z|00730|binding|INFO|Claiming lport da8778bf-4a44-45d9-ad3d-1ec45d20b728 for this chassis.
Jan 22 22:55:27 compute-0 ovn_controller[94850]: 2026-01-22T22:55:27Z|00731|binding|INFO|da8778bf-4a44-45d9-ad3d-1ec45d20b728: Claiming fa:16:3e:6a:e0:df 10.100.0.13 2001:db8:0:1:f816:3eff:fe6a:e0df 2001:db8::f816:3eff:fe6a:e0df
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.524 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:e0:df 10.100.0.13 2001:db8:0:1:f816:3eff:fe6a:e0df 2001:db8::f816:3eff:fe6a:e0df'], port_security=['fa:16:3e:6a:e0:df 10.100.0.13 2001:db8:0:1:f816:3eff:fe6a:e0df 2001:db8::f816:3eff:fe6a:e0df'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe6a:e0df/64 2001:db8::f816:3eff:fe6a:e0df/64', 'neutron:device_id': '4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f1c75d27-f026-46dc-bbcc-ae8b83a80943', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=da8778bf-4a44-45d9-ad3d-1ec45d20b728) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:55:27 compute-0 ovn_controller[94850]: 2026-01-22T22:55:27Z|00732|binding|INFO|Setting lport da8778bf-4a44-45d9-ad3d-1ec45d20b728 ovn-installed in OVS
Jan 22 22:55:27 compute-0 ovn_controller[94850]: 2026-01-22T22:55:27Z|00733|binding|INFO|Setting lport da8778bf-4a44-45d9-ad3d-1ec45d20b728 up in Southbound
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.526 104215 INFO neutron.agent.ovn.metadata.agent [-] Port da8778bf-4a44-45d9-ad3d-1ec45d20b728 in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 bound to our chassis
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.527 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0dcb11b3-4f88-477e-8e29-469839246ce6
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.527 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.528 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.538 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b974fb5e-54b9-4960-ad12-f648f7e1e1e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.539 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0dcb11b3-41 in ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:55:27 compute-0 systemd-udevd[238824]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.540 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0dcb11b3-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.540 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a578fcae-af19-4ba1-8bfe-964939be3d39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.541 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5a8ca7-05b4-450b-9093-19a06707e335]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 systemd-machined[154006]: New machine qemu-77-instance-000000b3.
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.5531] device (tapda8778bf-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.5537] device (tapda8778bf-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.552 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[86d003d2-5775-4435-994e-1ce50bf6c9da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-000000b3.
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.565 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[16449150-415a-4a20-8d98-5627fa83bfff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.594 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[4f04f796-0e70-4b94-a5f6-5a65c07ad718]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.5997] manager: (tap0dcb11b3-40): new Veth device (/org/freedesktop/NetworkManager/Devices/347)
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.598 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fc98e7de-9944-4395-82d8-5883fb1094bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 systemd-udevd[238828]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.630 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[f2846bd1-f364-42b9-b6ad-d6cec08f5e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.633 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[bf0d67b5-c1f0-46f4-ba59-092013a52e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.6524] device (tap0dcb11b3-40): carrier: link connected
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.658 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[05587751-eeb6-46b5-903c-9479b445297e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.673 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[990c141a-3ffc-4f14-8ff1-669899cc4577]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0dcb11b3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:14:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616525, 'reachable_time': 36655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238857, 'error': None, 'target': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.688 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f17d1ec-b8fd-46f5-8665-146302d7edd2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:140a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616525, 'tstamp': 616525}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238858, 'error': None, 'target': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.703 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd1a8ed-4df7-4a14-bf2d-3b7b1c423c36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0dcb11b3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:14:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616525, 'reachable_time': 36655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238859, 'error': None, 'target': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.731 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[63c645da-2aea-41bf-a4ea-b79393fcab85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.788 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[77176e8c-3671-4dc5-8a17-112ae7643e11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.789 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0dcb11b3-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.790 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.790 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0dcb11b3-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.792 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 NetworkManager[54954]: <info>  [1769122527.7927] manager: (tap0dcb11b3-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 22 22:55:27 compute-0 kernel: tap0dcb11b3-40: entered promiscuous mode
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.794 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.795 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0dcb11b3-40, col_values=(('external_ids', {'iface-id': '0ee482bf-08e3-4c08-b19d-3799def66c4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:27 compute-0 ovn_controller[94850]: 2026-01-22T22:55:27Z|00734|binding|INFO|Releasing lport 0ee482bf-08e3-4c08-b19d-3799def66c4e from this chassis (sb_readonly=0)
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.796 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.797 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.799 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122527.7990832, 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.799 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0dcb11b3-4f88-477e-8e29-469839246ce6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0dcb11b3-4f88-477e-8e29-469839246ce6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.799 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] VM Started (Lifecycle Event)
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.800 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[97dcbde9-88eb-4aa7-bf35-b2fd3e34fbf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.801 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-0dcb11b3-4f88-477e-8e29-469839246ce6
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/0dcb11b3-4f88-477e-8e29-469839246ce6.pid.haproxy
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 0dcb11b3-4f88-477e-8e29-469839246ce6
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:55:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:27.801 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'env', 'PROCESS_TAG=haproxy-0dcb11b3-4f88-477e-8e29-469839246ce6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0dcb11b3-4f88-477e-8e29-469839246ce6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.808 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.982 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.994 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122527.8023334, 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:55:27 compute-0 nova_compute[182725]: 2026-01-22 22:55:27.995 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] VM Paused (Lifecycle Event)
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.018 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.022 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.044 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:55:28 compute-0 podman[238898]: 2026-01-22 22:55:28.163175649 +0000 UTC m=+0.053847339 container create ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 22:55:28 compute-0 systemd[1]: Started libpod-conmon-ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9.scope.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.211 182729 DEBUG nova.compute.manager [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.212 182729 DEBUG nova.compute.manager [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing instance network info cache due to event network-changed-964c5203-3352-4645-a500-ff3ce1e8d508. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.212 182729 DEBUG oslo_concurrency.lockutils [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.212 182729 DEBUG oslo_concurrency.lockutils [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.212 182729 DEBUG nova.network.neutron [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Refreshing network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:55:28 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:55:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1c6bc2e9ccef15fbe3fbac38f357177fa5d84d8776b582d00c3ddbc4ff92793/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:55:28 compute-0 podman[238898]: 2026-01-22 22:55:28.231896997 +0000 UTC m=+0.122568717 container init ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 22:55:28 compute-0 podman[238898]: 2026-01-22 22:55:28.138602299 +0000 UTC m=+0.029274009 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:55:28 compute-0 podman[238898]: 2026-01-22 22:55:28.238108032 +0000 UTC m=+0.128779722 container start ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 22:55:28 compute-0 podman[238914]: 2026-01-22 22:55:28.260281623 +0000 UTC m=+0.062120915 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350)
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [NOTICE]   (238945) : New worker (238959) forked
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [NOTICE]   (238945) : Loading success.
Jan 22 22:55:28 compute-0 podman[238911]: 2026-01-22 22:55:28.280778582 +0000 UTC m=+0.085033104 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.295 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.296 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.296 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.296 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.296 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.308 182729 INFO nova.compute.manager [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Terminating instance
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.318 182729 DEBUG nova.compute.manager [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:55:28 compute-0 kernel: tap964c5203-33 (unregistering): left promiscuous mode
Jan 22 22:55:28 compute-0 NetworkManager[54954]: <info>  [1769122528.3439] device (tap964c5203-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.353 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 ovn_controller[94850]: 2026-01-22T22:55:28Z|00735|binding|INFO|Releasing lport 964c5203-3352-4645-a500-ff3ce1e8d508 from this chassis (sb_readonly=0)
Jan 22 22:55:28 compute-0 ovn_controller[94850]: 2026-01-22T22:55:28Z|00736|binding|INFO|Setting lport 964c5203-3352-4645-a500-ff3ce1e8d508 down in Southbound
Jan 22 22:55:28 compute-0 ovn_controller[94850]: 2026-01-22T22:55:28Z|00737|binding|INFO|Removing iface tap964c5203-33 ovn-installed in OVS
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.355 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.365 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.370 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:30:f4 10.100.0.12'], port_security=['fa:16:3e:98:30:f4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2d5faa8e-c258-40cc-85a4-fa95e31593c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1ed5133b-95fa-4539-885a-e8aa5db43dd3 77169cf9-4ddb-4a48-a907-fa4abc0d69fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd7cbcb5-c231-44bb-be1c-c0898fbee74d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=964c5203-3352-4645-a500-ff3ce1e8d508) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.372 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 964c5203-3352-4645-a500-ff3ce1e8d508 in datapath 930b9b12-ffcc-452a-86e1-0321bc77aa71 unbound from our chassis
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.373 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 930b9b12-ffcc-452a-86e1-0321bc77aa71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.375 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[55fd4380-4615-4297-b072-3688a4c35969]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.375 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 namespace which is not needed anymore
Jan 22 22:55:28 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 22 22:55:28 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b0.scope: Consumed 14.922s CPU time.
Jan 22 22:55:28 compute-0 systemd-machined[154006]: Machine qemu-76-instance-000000b0 terminated.
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [NOTICE]   (238529) : haproxy version is 2.8.14-c23fe91
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [NOTICE]   (238529) : path to executable is /usr/sbin/haproxy
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [WARNING]  (238529) : Exiting Master process...
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [ALERT]    (238529) : Current worker (238532) exited with code 143 (Terminated)
Jan 22 22:55:28 compute-0 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[238519]: [WARNING]  (238529) : All workers exited. Exiting... (0)
Jan 22 22:55:28 compute-0 systemd[1]: libpod-ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7.scope: Deactivated successfully.
Jan 22 22:55:28 compute-0 podman[238991]: 2026-01-22 22:55:28.504917033 +0000 UTC m=+0.042287872 container died ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 22:55:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd81dd37ae966979601cf75e6bc5a9d457bbf583957991384209c139b5ccc56c-merged.mount: Deactivated successfully.
Jan 22 22:55:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7-userdata-shm.mount: Deactivated successfully.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.536 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 podman[238991]: 2026-01-22 22:55:28.539471312 +0000 UTC m=+0.076842161 container cleanup ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.541 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 systemd[1]: libpod-conmon-ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7.scope: Deactivated successfully.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.575 182729 INFO nova.virt.libvirt.driver [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance destroyed successfully.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.576 182729 DEBUG nova.objects.instance [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid 2d5faa8e-c258-40cc-85a4-fa95e31593c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.594 182729 DEBUG nova.virt.libvirt.vif [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:54:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2146779521',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=176,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:54:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-apphhjiq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:54:31Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=2d5faa8e-c258-40cc-85a4-fa95e31593c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.594 182729 DEBUG nova.network.os_vif_util [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.595 182729 DEBUG nova.network.os_vif_util [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.596 182729 DEBUG os_vif [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.598 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.598 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap964c5203-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.601 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.605 182729 DEBUG nova.compute.manager [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-unplugged-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.605 182729 DEBUG oslo_concurrency.lockutils [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.606 182729 DEBUG oslo_concurrency.lockutils [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.606 182729 DEBUG oslo_concurrency.lockutils [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.606 182729 DEBUG nova.compute.manager [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] No waiting events found dispatching network-vif-unplugged-964c5203-3352-4645-a500-ff3ce1e8d508 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.606 182729 DEBUG nova.compute.manager [req-2593c279-51bd-4776-aaf6-77738ef14ff9 req-02298c02-3551-406e-bc61-3d3c13d7a0f7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-unplugged-964c5203-3352-4645-a500-ff3ce1e8d508 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.607 182729 INFO os_vif [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:30:f4,bridge_name='br-int',has_traffic_filtering=True,id=964c5203-3352-4645-a500-ff3ce1e8d508,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap964c5203-33')
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.608 182729 INFO nova.virt.libvirt.driver [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Deleting instance files /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2_del
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.609 182729 INFO nova.virt.libvirt.driver [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Deletion of /var/lib/nova/instances/2d5faa8e-c258-40cc-85a4-fa95e31593c2_del complete
Jan 22 22:55:28 compute-0 podman[239030]: 2026-01-22 22:55:28.611057931 +0000 UTC m=+0.046381964 container remove ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.616 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[62c68d1b-a71e-478e-93b6-cee784985642]: (4, ('Thu Jan 22 10:55:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 (ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7)\nff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7\nThu Jan 22 10:55:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 (ff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7)\nff350a0425a551561ffe3ae861a6b75e46b85f98d37e2e485f92e032dff3eab7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.618 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[82304f9d-c356-48e0-8504-4b1ab595023b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.619 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap930b9b12-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.620 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 kernel: tap930b9b12-f0: left promiscuous mode
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.631 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.633 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f05d55f-e7ae-401a-942f-35300b31833f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.651 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3caa5a02-4b59-4d1b-9bb1-64d28d914e13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.652 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e190e6d4-3b44-4ce7-9402-1c5a22c8c82f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.666 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc57ef6-0a4c-4d85-ba4f-6f74c8dec776]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610834, 'reachable_time': 15191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239053, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.668 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:55:28 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:28.668 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[5435972c-1642-464f-a541-916852d13088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d930b9b12\x2dffcc\x2d452a\x2d86e1\x2d0321bc77aa71.mount: Deactivated successfully.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.712 182729 INFO nova.compute.manager [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.713 182729 DEBUG oslo.service.loopingcall [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.713 182729 DEBUG nova.compute.manager [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.714 182729 DEBUG nova.network.neutron [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.757 182729 DEBUG nova.compute.manager [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.758 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.759 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.760 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.760 182729 DEBUG nova.compute.manager [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Processing event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.760 182729 DEBUG nova.compute.manager [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.761 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.761 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.761 182729 DEBUG oslo_concurrency.lockutils [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.762 182729 DEBUG nova.compute.manager [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] No waiting events found dispatching network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.762 182729 WARNING nova.compute.manager [req-3dcea5de-44e9-494b-98cd-561c08c02567 req-ad9f9dc5-d86c-420d-b123-7e631de65510 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received unexpected event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 for instance with vm_state building and task_state spawning.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.763 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.766 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122528.7664607, 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.766 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] VM Resumed (Lifecycle Event)
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.768 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.771 182729 INFO nova.virt.libvirt.driver [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Instance spawned successfully.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.772 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.794 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.796 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.802 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.803 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.803 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.803 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.804 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.804 182729 DEBUG nova.virt.libvirt.driver [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.812 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.874 182729 INFO nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Took 10.70 seconds to spawn the instance on the hypervisor.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.874 182729 DEBUG nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.951 182729 INFO nova.compute.manager [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Took 11.26 seconds to build instance.
Jan 22 22:55:28 compute-0 nova_compute[182725]: 2026-01-22 22:55:28.976 182729 DEBUG oslo_concurrency.lockutils [None req-d4a05464-186d-41f8-b79a-eebfae26b128 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.545 182729 DEBUG nova.network.neutron [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updated VIF entry in instance network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.546 182729 DEBUG nova.network.neutron [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.559 182729 DEBUG oslo_concurrency.lockutils [req-0dbc9c7b-e902-4bde-8d10-280d5ababa13 req-13b0d001-c7e7-48c9-afba-e36c952c67cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.705 182729 DEBUG nova.network.neutron [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.721 182729 INFO nova.compute.manager [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Took 1.01 seconds to deallocate network for instance.
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.770 182729 DEBUG nova.network.neutron [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updated VIF entry in instance network info cache for port 964c5203-3352-4645-a500-ff3ce1e8d508. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.770 182729 DEBUG nova.network.neutron [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Updating instance_info_cache with network_info: [{"id": "964c5203-3352-4645-a500-ff3ce1e8d508", "address": "fa:16:3e:98:30:f4", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap964c5203-33", "ovs_interfaceid": "964c5203-3352-4645-a500-ff3ce1e8d508", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.801 182729 DEBUG oslo_concurrency.lockutils [req-c351783f-7240-4cd5-a1ff-c5c48c21a227 req-796c729c-5074-4bba-a1b4-1c56ba3cfb2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2d5faa8e-c258-40cc-85a4-fa95e31593c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.817 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.818 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.885 182729 DEBUG nova.compute.provider_tree [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.897 182729 DEBUG nova.scheduler.client.report [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.915 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:29 compute-0 nova_compute[182725]: 2026-01-22 22:55:29.960 182729 INFO nova.scheduler.client.report [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance 2d5faa8e-c258-40cc-85a4-fa95e31593c2
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.049 182729 DEBUG oslo_concurrency.lockutils [None req-0ab9f887-a02f-4d68-83ab-68fdd2221c25 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.214 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.322 182729 DEBUG nova.compute.manager [req-5c8de5c1-87dd-4582-b7fe-411e8c9a63d6 req-d88dd19e-34a1-42c1-904a-6b9ee3749c1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-deleted-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.322 182729 INFO nova.compute.manager [req-5c8de5c1-87dd-4582-b7fe-411e8c9a63d6 req-d88dd19e-34a1-42c1-904a-6b9ee3749c1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Neutron deleted interface 964c5203-3352-4645-a500-ff3ce1e8d508; detaching it from the instance and deleting it from the info cache
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.322 182729 DEBUG nova.network.neutron [req-5c8de5c1-87dd-4582-b7fe-411e8c9a63d6 req-d88dd19e-34a1-42c1-904a-6b9ee3749c1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.324 182729 DEBUG nova.compute.manager [req-5c8de5c1-87dd-4582-b7fe-411e8c9a63d6 req-d88dd19e-34a1-42c1-904a-6b9ee3749c1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Detach interface failed, port_id=964c5203-3352-4645-a500-ff3ce1e8d508, reason: Instance 2d5faa8e-c258-40cc-85a4-fa95e31593c2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.723 182729 DEBUG nova.compute.manager [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.723 182729 DEBUG oslo_concurrency.lockutils [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.723 182729 DEBUG oslo_concurrency.lockutils [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.723 182729 DEBUG oslo_concurrency.lockutils [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2d5faa8e-c258-40cc-85a4-fa95e31593c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.723 182729 DEBUG nova.compute.manager [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] No waiting events found dispatching network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:55:30 compute-0 nova_compute[182725]: 2026-01-22 22:55:30.724 182729 WARNING nova.compute.manager [req-5b5abde7-10a2-48a5-9871-a7a7ed73c394 req-6ef4354a-3ac2-41ac-b3be-dd971bd8fc3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Received unexpected event network-vif-plugged-964c5203-3352-4645-a500-ff3ce1e8d508 for instance with vm_state deleted and task_state None.
Jan 22 22:55:33 compute-0 nova_compute[182725]: 2026-01-22 22:55:33.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:33 compute-0 ovn_controller[94850]: 2026-01-22T22:55:33Z|00738|binding|INFO|Releasing lport 0ee482bf-08e3-4c08-b19d-3799def66c4e from this chassis (sb_readonly=0)
Jan 22 22:55:33 compute-0 nova_compute[182725]: 2026-01-22 22:55:33.921 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.241 182729 DEBUG nova.compute.manager [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.241 182729 DEBUG nova.compute.manager [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing instance network info cache due to event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.241 182729 DEBUG oslo_concurrency.lockutils [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.242 182729 DEBUG oslo_concurrency.lockutils [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.242 182729 DEBUG nova.network.neutron [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:55:34 compute-0 nova_compute[182725]: 2026-01-22 22:55:34.897 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:35 compute-0 nova_compute[182725]: 2026-01-22 22:55:35.215 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:36 compute-0 nova_compute[182725]: 2026-01-22 22:55:36.206 182729 DEBUG nova.network.neutron [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updated VIF entry in instance network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:55:36 compute-0 nova_compute[182725]: 2026-01-22 22:55:36.207 182729 DEBUG nova.network.neutron [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:36 compute-0 nova_compute[182725]: 2026-01-22 22:55:36.227 182729 DEBUG oslo_concurrency.lockutils [req-f6946716-29c2-44ff-937d-4e80aa34edc0 req-a0d557e5-d765-4574-9eac-87f55533f45d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:37 compute-0 nova_compute[182725]: 2026-01-22 22:55:37.535 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:38 compute-0 nova_compute[182725]: 2026-01-22 22:55:38.605 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:40 compute-0 podman[239076]: 2026-01-22 22:55:40.122595483 +0000 UTC m=+0.059100449 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:55:40 compute-0 podman[239077]: 2026-01-22 22:55:40.122592103 +0000 UTC m=+0.056023783 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 22:55:40 compute-0 podman[239078]: 2026-01-22 22:55:40.126060689 +0000 UTC m=+0.058210217 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:55:40 compute-0 ovn_controller[94850]: 2026-01-22T22:55:40Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:e0:df 10.100.0.13
Jan 22 22:55:40 compute-0 ovn_controller[94850]: 2026-01-22T22:55:40Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:e0:df 10.100.0.13
Jan 22 22:55:40 compute-0 nova_compute[182725]: 2026-01-22 22:55:40.240 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:40 compute-0 nova_compute[182725]: 2026-01-22 22:55:40.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:40 compute-0 nova_compute[182725]: 2026-01-22 22:55:40.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:55:40 compute-0 nova_compute[182725]: 2026-01-22 22:55:40.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:55:41 compute-0 nova_compute[182725]: 2026-01-22 22:55:41.188 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:41 compute-0 nova_compute[182725]: 2026-01-22 22:55:41.188 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:41 compute-0 nova_compute[182725]: 2026-01-22 22:55:41.188 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:55:41 compute-0 nova_compute[182725]: 2026-01-22 22:55:41.189 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:55:43 compute-0 nova_compute[182725]: 2026-01-22 22:55:43.574 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122528.572515, 2d5faa8e-c258-40cc-85a4-fa95e31593c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:55:43 compute-0 nova_compute[182725]: 2026-01-22 22:55:43.574 182729 INFO nova.compute.manager [-] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] VM Stopped (Lifecycle Event)
Jan 22 22:55:43 compute-0 nova_compute[182725]: 2026-01-22 22:55:43.607 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:43 compute-0 nova_compute[182725]: 2026-01-22 22:55:43.702 182729 DEBUG nova.compute.manager [None req-9e4fc48c-958c-413b-bc8d-f1a1bf6cf8c4 - - - - - -] [instance: 2d5faa8e-c258-40cc-85a4-fa95e31593c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.285 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.960 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.987 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.988 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.988 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.988 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.988 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:45 compute-0 nova_compute[182725]: 2026-01-22 22:55:45.989 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.047 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.048 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.048 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.048 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.132 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.193 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.195 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.258 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.405 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.406 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5459MB free_disk=73.28765869140625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.407 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.407 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.474 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.474 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.475 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.511 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.677 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.693 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:55:46 compute-0 nova_compute[182725]: 2026-01-22 22:55:46.694 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:47 compute-0 nova_compute[182725]: 2026-01-22 22:55:47.593 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:47.911 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:55:47 compute-0 nova_compute[182725]: 2026-01-22 22:55:47.911 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:47.912 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:55:48 compute-0 nova_compute[182725]: 2026-01-22 22:55:48.196 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:48 compute-0 nova_compute[182725]: 2026-01-22 22:55:48.609 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:48 compute-0 nova_compute[182725]: 2026-01-22 22:55:48.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:48 compute-0 nova_compute[182725]: 2026-01-22 22:55:48.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:55:48 compute-0 nova_compute[182725]: 2026-01-22 22:55:48.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:55:48 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:48.914 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:50 compute-0 nova_compute[182725]: 2026-01-22 22:55:50.288 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.033 182729 DEBUG nova.compute.manager [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.034 182729 DEBUG nova.compute.manager [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing instance network info cache due to event network-changed-da8778bf-4a44-45d9-ad3d-1ec45d20b728. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.034 182729 DEBUG oslo_concurrency.lockutils [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.034 182729 DEBUG oslo_concurrency.lockutils [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.034 182729 DEBUG nova.network.neutron [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Refreshing network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.089 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.089 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.089 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.090 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.090 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.102 182729 INFO nova.compute.manager [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Terminating instance
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.112 182729 DEBUG nova.compute.manager [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:55:53 compute-0 podman[239149]: 2026-01-22 22:55:53.120576493 +0000 UTC m=+0.053642244 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:55:53 compute-0 kernel: tapda8778bf-4a (unregistering): left promiscuous mode
Jan 22 22:55:53 compute-0 NetworkManager[54954]: <info>  [1769122553.1364] device (tapda8778bf-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.141 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 ovn_controller[94850]: 2026-01-22T22:55:53Z|00739|binding|INFO|Releasing lport da8778bf-4a44-45d9-ad3d-1ec45d20b728 from this chassis (sb_readonly=0)
Jan 22 22:55:53 compute-0 ovn_controller[94850]: 2026-01-22T22:55:53Z|00740|binding|INFO|Setting lport da8778bf-4a44-45d9-ad3d-1ec45d20b728 down in Southbound
Jan 22 22:55:53 compute-0 ovn_controller[94850]: 2026-01-22T22:55:53Z|00741|binding|INFO|Removing iface tapda8778bf-4a ovn-installed in OVS
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.144 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.150 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:e0:df 10.100.0.13 2001:db8:0:1:f816:3eff:fe6a:e0df 2001:db8::f816:3eff:fe6a:e0df'], port_security=['fa:16:3e:6a:e0:df 10.100.0.13 2001:db8:0:1:f816:3eff:fe6a:e0df 2001:db8::f816:3eff:fe6a:e0df'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe6a:e0df/64 2001:db8::f816:3eff:fe6a:e0df/64', 'neutron:device_id': '4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1c75d27-f026-46dc-bbcc-ae8b83a80943', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=da8778bf-4a44-45d9-ad3d-1ec45d20b728) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.151 104215 INFO neutron.agent.ovn.metadata.agent [-] Port da8778bf-4a44-45d9-ad3d-1ec45d20b728 in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 unbound from our chassis
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.152 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dcb11b3-4f88-477e-8e29-469839246ce6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.153 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ea6dd3-179a-4924-b5c0-0534eb57ce25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.153 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6 namespace which is not needed anymore
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.159 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 22 22:55:53 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b3.scope: Consumed 11.872s CPU time.
Jan 22 22:55:53 compute-0 systemd-machined[154006]: Machine qemu-77-instance-000000b3 terminated.
Jan 22 22:55:53 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [NOTICE]   (238945) : haproxy version is 2.8.14-c23fe91
Jan 22 22:55:53 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [NOTICE]   (238945) : path to executable is /usr/sbin/haproxy
Jan 22 22:55:53 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [WARNING]  (238945) : Exiting Master process...
Jan 22 22:55:53 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [ALERT]    (238945) : Current worker (238959) exited with code 143 (Terminated)
Jan 22 22:55:53 compute-0 neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6[238915]: [WARNING]  (238945) : All workers exited. Exiting... (0)
Jan 22 22:55:53 compute-0 systemd[1]: libpod-ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9.scope: Deactivated successfully.
Jan 22 22:55:53 compute-0 podman[239194]: 2026-01-22 22:55:53.282408566 +0000 UTC m=+0.045106992 container died ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:55:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9-userdata-shm.mount: Deactivated successfully.
Jan 22 22:55:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-c1c6bc2e9ccef15fbe3fbac38f357177fa5d84d8776b582d00c3ddbc4ff92793-merged.mount: Deactivated successfully.
Jan 22 22:55:53 compute-0 podman[239194]: 2026-01-22 22:55:53.319571709 +0000 UTC m=+0.082270165 container cleanup ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:55:53 compute-0 systemd[1]: libpod-conmon-ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9.scope: Deactivated successfully.
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.336 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.361 182729 DEBUG nova.compute.manager [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-unplugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.362 182729 DEBUG oslo_concurrency.lockutils [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.362 182729 DEBUG oslo_concurrency.lockutils [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.363 182729 DEBUG oslo_concurrency.lockutils [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.363 182729 DEBUG nova.compute.manager [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] No waiting events found dispatching network-vif-unplugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.363 182729 DEBUG nova.compute.manager [req-1e7372ca-395c-4001-9747-50a9782d6542 req-6a0b3b8e-c566-4ea7-8735-131f7239c04b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-unplugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.386 182729 INFO nova.virt.libvirt.driver [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Instance destroyed successfully.
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.387 182729 DEBUG nova.objects.instance [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:55:53 compute-0 podman[239225]: 2026-01-22 22:55:53.389148779 +0000 UTC m=+0.041185295 container remove ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.395 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[608b55d0-119f-49f6-bb07-48478ac89c05]: (4, ('Thu Jan 22 10:55:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6 (ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9)\nce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9\nThu Jan 22 10:55:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6 (ce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9)\nce205e888d64ffcb9314dfddfe57aaca2a5dafff09f2edaeb6e4ef906feb2ab9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.396 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[3f13b562-b4d0-47e8-9528-3da096baa835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.397 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0dcb11b3-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.399 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 kernel: tap0dcb11b3-40: left promiscuous mode
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.406 182729 DEBUG nova.virt.libvirt.vif [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523065053',display_name='tempest-TestGettingAddress-server-1523065053',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523065053',id=179,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxyF6oDs5NL5SIJdgv11Cb5pkqsRGPmB3cHPKqEJ7v8YrcNdEWhfsxiVfH+ECyaszYZaZ2to32RK3TfjuRjA3+A+nm4Vjw7gDY4fF8gxKEWJ5+K8RKZHavOEN/KGbXqvQ==',key_name='tempest-TestGettingAddress-38314300',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:55:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-81xquuf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:55:28Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.406 182729 DEBUG nova.network.os_vif_util [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.407 182729 DEBUG nova.network.os_vif_util [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.407 182729 DEBUG os_vif [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.410 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.410 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda8778bf-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.412 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.414 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.417 182729 INFO os_vif [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:e0:df,bridge_name='br-int',has_traffic_filtering=True,id=da8778bf-4a44-45d9-ad3d-1ec45d20b728,network=Network(0dcb11b3-4f88-477e-8e29-469839246ce6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda8778bf-4a')
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.417 182729 INFO nova.virt.libvirt.driver [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Deleting instance files /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d_del
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.418 182729 INFO nova.virt.libvirt.driver [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Deletion of /var/lib/nova/instances/4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d_del complete
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.418 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f5efec-21d4-4796-92d2-0328411b24ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.434 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[376d3896-026f-4d8e-9f6b-8b6c528636b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.435 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[78c85d3b-c104-488e-98be-f12f3e36762a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.449 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8e39e690-e0f8-46e4-a57a-6436ec2016b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616519, 'reachable_time': 25652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239255, 'error': None, 'target': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.451 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:55:53 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:55:53.452 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[091d8993-0cc7-46cf-a8f7-483771b5c2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:55:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d0dcb11b3\x2d4f88\x2d477e\x2d8e29\x2d469839246ce6.mount: Deactivated successfully.
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.495 182729 INFO nova.compute.manager [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.496 182729 DEBUG oslo.service.loopingcall [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.496 182729 DEBUG nova.compute.manager [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:55:53 compute-0 nova_compute[182725]: 2026-01-22 22:55:53.497 182729 DEBUG nova.network.neutron [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:55:54 compute-0 nova_compute[182725]: 2026-01-22 22:55:54.866 182729 DEBUG nova.network.neutron [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:54 compute-0 nova_compute[182725]: 2026-01-22 22:55:54.884 182729 INFO nova.compute.manager [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Took 1.39 seconds to deallocate network for instance.
Jan 22 22:55:54 compute-0 nova_compute[182725]: 2026-01-22 22:55:54.981 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:54 compute-0 nova_compute[182725]: 2026-01-22 22:55:54.982 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:54 compute-0 nova_compute[182725]: 2026-01-22 22:55:54.984 182729 DEBUG nova.compute.manager [req-144388b2-3b44-4eb6-90ae-c1a266fff84a req-af95c2e5-846c-4698-a2a2-f9c3642a02df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-deleted-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.032 182729 DEBUG nova.compute.provider_tree [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.043 182729 DEBUG nova.scheduler.client.report [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.070 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.091 182729 INFO nova.scheduler.client.report [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.165 182729 DEBUG oslo_concurrency.lockutils [None req-fcc2eecf-924c-4cde-9582-dec0dcbcf294 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.291 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.325 182729 DEBUG nova.network.neutron [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updated VIF entry in instance network info cache for port da8778bf-4a44-45d9-ad3d-1ec45d20b728. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.325 182729 DEBUG nova.network.neutron [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Updating instance_info_cache with network_info: [{"id": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "address": "fa:16:3e:6a:e0:df", "network": {"id": "0dcb11b3-4f88-477e-8e29-469839246ce6", "bridge": "br-int", "label": "tempest-network-smoke--1204287640", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:e0df", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda8778bf-4a", "ovs_interfaceid": "da8778bf-4a44-45d9-ad3d-1ec45d20b728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.349 182729 DEBUG oslo_concurrency.lockutils [req-3d848ac1-885a-4b14-a4af-3945baa78b38 req-aca6b932-3926-4eec-be59-93f71392a4b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.464 182729 DEBUG nova.compute.manager [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.465 182729 DEBUG oslo_concurrency.lockutils [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.465 182729 DEBUG oslo_concurrency.lockutils [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.465 182729 DEBUG oslo_concurrency.lockutils [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.466 182729 DEBUG nova.compute.manager [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] No waiting events found dispatching network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:55:55 compute-0 nova_compute[182725]: 2026-01-22 22:55:55.466 182729 WARNING nova.compute.manager [req-b07488d0-f986-4763-b355-edb0b9ab8f49 req-bdb0b937-b7e1-4c16-ac84-a80e2dec70c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Received unexpected event network-vif-plugged-da8778bf-4a44-45d9-ad3d-1ec45d20b728 for instance with vm_state deleted and task_state None.
Jan 22 22:55:58 compute-0 nova_compute[182725]: 2026-01-22 22:55:58.412 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:55:59 compute-0 podman[239257]: 2026-01-22 22:55:59.119721698 +0000 UTC m=+0.050944007 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc.)
Jan 22 22:55:59 compute-0 podman[239256]: 2026-01-22 22:55:59.175989847 +0000 UTC m=+0.109019081 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 22 22:56:00 compute-0 nova_compute[182725]: 2026-01-22 22:56:00.293 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:01 compute-0 nova_compute[182725]: 2026-01-22 22:56:01.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:03 compute-0 sshd-session[239303]: Invalid user AdminGPON from 45.148.10.121 port 42862
Jan 22 22:56:03 compute-0 sshd-session[239303]: Connection closed by invalid user AdminGPON 45.148.10.121 port 42862 [preauth]
Jan 22 22:56:03 compute-0 nova_compute[182725]: 2026-01-22 22:56:03.414 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:05 compute-0 nova_compute[182725]: 2026-01-22 22:56:05.352 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:08 compute-0 nova_compute[182725]: 2026-01-22 22:56:08.385 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122553.383724, 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:56:08 compute-0 nova_compute[182725]: 2026-01-22 22:56:08.385 182729 INFO nova.compute.manager [-] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] VM Stopped (Lifecycle Event)
Jan 22 22:56:08 compute-0 nova_compute[182725]: 2026-01-22 22:56:08.408 182729 DEBUG nova.compute.manager [None req-d875e521-8539-4ca7-92c8-9ddc30a7617c - - - - - -] [instance: 4f2c09a6-cb99-4588-9dd8-b5ef3fd9aa6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:56:08 compute-0 nova_compute[182725]: 2026-01-22 22:56:08.416 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:56:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:56:10 compute-0 nova_compute[182725]: 2026-01-22 22:56:10.355 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:11 compute-0 podman[239307]: 2026-01-22 22:56:11.115198661 +0000 UTC m=+0.048774194 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 22:56:11 compute-0 podman[239306]: 2026-01-22 22:56:11.131480545 +0000 UTC m=+0.059493889 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 22:56:11 compute-0 podman[239308]: 2026-01-22 22:56:11.153992475 +0000 UTC m=+0.082694086 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:56:11 compute-0 nova_compute[182725]: 2026-01-22 22:56:11.597 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:11 compute-0 nova_compute[182725]: 2026-01-22 22:56:11.699 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:12.466 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:13 compute-0 nova_compute[182725]: 2026-01-22 22:56:13.417 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:15 compute-0 nova_compute[182725]: 2026-01-22 22:56:15.357 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:18 compute-0 nova_compute[182725]: 2026-01-22 22:56:18.418 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:20 compute-0 nova_compute[182725]: 2026-01-22 22:56:20.359 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:23 compute-0 nova_compute[182725]: 2026-01-22 22:56:23.420 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:24 compute-0 podman[239372]: 2026-01-22 22:56:24.149218175 +0000 UTC m=+0.080154144 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 22:56:25 compute-0 nova_compute[182725]: 2026-01-22 22:56:25.361 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:28 compute-0 nova_compute[182725]: 2026-01-22 22:56:28.422 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:30 compute-0 podman[239394]: 2026-01-22 22:56:30.147893848 +0000 UTC m=+0.076560834 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Jan 22 22:56:30 compute-0 podman[239393]: 2026-01-22 22:56:30.184776485 +0000 UTC m=+0.109533994 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:56:30 compute-0 nova_compute[182725]: 2026-01-22 22:56:30.362 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:32.068 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:87:28 10.100.0.2 2001:db8::f816:3eff:fe34:8728'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe34:8728/64', 'neutron:device_id': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7e638a4c-ee74-4c71-b2dd-f7bbad609134) old=Port_Binding(mac=['fa:16:3e:34:87:28 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:56:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:32.070 104215 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7e638a4c-ee74-4c71-b2dd-f7bbad609134 in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d updated
Jan 22 22:56:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:32.071 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8d77c31-420b-47d9-87ac-6c37fe7e216d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:56:32 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:32.072 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[74c3083e-5704-4f08-9edd-6f429b803dc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:33 compute-0 nova_compute[182725]: 2026-01-22 22:56:33.423 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:35 compute-0 nova_compute[182725]: 2026-01-22 22:56:35.044 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:35 compute-0 nova_compute[182725]: 2026-01-22 22:56:35.415 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.305 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.307 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.332 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.426 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.472 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.473 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.481 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.481 182729 INFO nova.compute.claims [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.641 182729 DEBUG nova.compute.provider_tree [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.658 182729 DEBUG nova.scheduler.client.report [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.686 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.686 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.768 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.769 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.811 182729 INFO nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.829 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.947 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.948 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.948 182729 INFO nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Creating image(s)
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.949 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.949 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.950 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:38 compute-0 nova_compute[182725]: 2026-01-22 22:56:38.967 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.025 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.026 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.027 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.039 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.093 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.094 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.127 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.128 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.128 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.179 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.181 182729 DEBUG nova.virt.disk.api [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.182 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.213 182729 DEBUG nova.policy [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.277 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.278 182729 DEBUG nova.virt.disk.api [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.278 182729 DEBUG nova.objects.instance [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid d9ca14c7-c946-461f-a618-05e1e60d8b75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.302 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.303 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Ensure instance console log exists: /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.303 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.304 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:39 compute-0 nova_compute[182725]: 2026-01-22 22:56:39.304 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:40 compute-0 nova_compute[182725]: 2026-01-22 22:56:40.417 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:40 compute-0 nova_compute[182725]: 2026-01-22 22:56:40.732 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:40.733 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:56:40 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:40.734 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:56:40 compute-0 nova_compute[182725]: 2026-01-22 22:56:40.832 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Successfully created port: 36036bd3-ee74-4b35-b2b5-8d585f1ae59a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:56:42 compute-0 podman[239455]: 2026-01-22 22:56:42.121110425 +0000 UTC m=+0.058657279 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 22:56:42 compute-0 podman[239456]: 2026-01-22 22:56:42.135500293 +0000 UTC m=+0.069038547 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:56:42 compute-0 podman[239454]: 2026-01-22 22:56:42.135722658 +0000 UTC m=+0.067400996 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.244 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Successfully updated port: 36036bd3-ee74-4b35-b2b5-8d585f1ae59a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.260 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.260 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.260 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.370 182729 DEBUG nova.compute.manager [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.370 182729 DEBUG nova.compute.manager [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing instance network info cache due to event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.371 182729 DEBUG oslo_concurrency.lockutils [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.435 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:56:42 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:42.736 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.920 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 22:56:42 compute-0 nova_compute[182725]: 2026-01-22 22:56:42.921 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.459 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:43 compute-0 nova_compute[182725]: 2026-01-22 22:56:43.918 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.082 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.084 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.31648635864258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.084 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.084 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.141 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance d9ca14c7-c946-461f-a618-05e1e60d8b75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.141 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.142 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.167 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.190 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.191 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.203 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.222 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.269 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.307 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.343 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:56:44 compute-0 nova_compute[182725]: 2026-01-22 22:56:44.344 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.236 182729 DEBUG nova.network.neutron [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.266 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.267 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Instance network_info: |[{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.267 182729 DEBUG oslo_concurrency.lockutils [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.267 182729 DEBUG nova.network.neutron [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.269 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Start _get_guest_xml network_info=[{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.274 182729 WARNING nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.278 182729 DEBUG nova.virt.libvirt.host [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.279 182729 DEBUG nova.virt.libvirt.host [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.285 182729 DEBUG nova.virt.libvirt.host [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.285 182729 DEBUG nova.virt.libvirt.host [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.286 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.287 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.287 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.287 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.287 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.288 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.288 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.288 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.288 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.289 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.289 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.289 182729 DEBUG nova.virt.hardware [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.292 182729 DEBUG nova.virt.libvirt.vif [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1901361657',display_name='tempest-TestGettingAddress-server-1901361657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1901361657',id=182,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-pyobw218',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:56:38Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=d9ca14c7-c946-461f-a618-05e1e60d8b75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.292 182729 DEBUG nova.network.os_vif_util [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.293 182729 DEBUG nova.network.os_vif_util [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.294 182729 DEBUG nova.objects.instance [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid d9ca14c7-c946-461f-a618-05e1e60d8b75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.320 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <uuid>d9ca14c7-c946-461f-a618-05e1e60d8b75</uuid>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <name>instance-000000b6</name>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:name>tempest-TestGettingAddress-server-1901361657</nova:name>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:56:45</nova:creationTime>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         <nova:port uuid="36036bd3-ee74-4b35-b2b5-8d585f1ae59a">
Jan 22 22:56:45 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe8e:f119" ipVersion="6"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <system>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="serial">d9ca14c7-c946-461f-a618-05e1e60d8b75</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="uuid">d9ca14c7-c946-461f-a618-05e1e60d8b75</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </system>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <os>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </os>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <features>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </features>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.config"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:8e:f1:19"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <target dev="tap36036bd3-ee"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/console.log" append="off"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <video>
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </video>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:56:45 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:56:45 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:56:45 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:56:45 compute-0 nova_compute[182725]: </domain>
Jan 22 22:56:45 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.321 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Preparing to wait for external event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.321 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.322 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.322 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.322 182729 DEBUG nova.virt.libvirt.vif [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1901361657',display_name='tempest-TestGettingAddress-server-1901361657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1901361657',id=182,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-pyobw218',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:56:38Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=d9ca14c7-c946-461f-a618-05e1e60d8b75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.323 182729 DEBUG nova.network.os_vif_util [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.324 182729 DEBUG nova.network.os_vif_util [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.324 182729 DEBUG os_vif [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.324 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.325 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.325 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.327 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.327 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36036bd3-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.328 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36036bd3-ee, col_values=(('external_ids', {'iface-id': '36036bd3-ee74-4b35-b2b5-8d585f1ae59a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:f1:19', 'vm-uuid': 'd9ca14c7-c946-461f-a618-05e1e60d8b75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.329 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:45 compute-0 NetworkManager[54954]: <info>  [1769122605.3314] manager: (tap36036bd3-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.336 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.337 182729 INFO os_vif [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee')
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.343 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.343 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.378 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.378 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.378 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:8e:f1:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.379 182729 INFO nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Using config drive
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.418 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:45 compute-0 nova_compute[182725]: 2026-01-22 22:56:45.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.231 182729 INFO nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Creating config drive at /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.config
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.240 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ybl5un3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.371 182729 DEBUG oslo_concurrency.processutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ybl5un3" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:56:46 compute-0 kernel: tap36036bd3-ee: entered promiscuous mode
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.4661] manager: (tap36036bd3-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/350)
Jan 22 22:56:46 compute-0 ovn_controller[94850]: 2026-01-22T22:56:46Z|00742|binding|INFO|Claiming lport 36036bd3-ee74-4b35-b2b5-8d585f1ae59a for this chassis.
Jan 22 22:56:46 compute-0 ovn_controller[94850]: 2026-01-22T22:56:46Z|00743|binding|INFO|36036bd3-ee74-4b35-b2b5-8d585f1ae59a: Claiming fa:16:3e:8e:f1:19 10.100.0.12 2001:db8::f816:3eff:fe8e:f119
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.468 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.481 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.493 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:f1:19 10.100.0.12 2001:db8::f816:3eff:fe8e:f119'], port_security=['fa:16:3e:8e:f1:19 10.100.0.12 2001:db8::f816:3eff:fe8e:f119'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe8e:f119/64', 'neutron:device_id': 'd9ca14c7-c946-461f-a618-05e1e60d8b75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ed139db-5528-4ba0-9d69-09cd70ed61c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=36036bd3-ee74-4b35-b2b5-8d585f1ae59a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.494 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d bound to our chassis
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.495 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8d77c31-420b-47d9-87ac-6c37fe7e216d
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.505 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8587ac-1d11-40a6-ab9e-c7bd4b994a4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.506 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8d77c31-41 in ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.507 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8d77c31-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.508 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e5cfe514-6c43-44cc-83a0-c2eb754935ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.508 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a82767-c3a9-4fd2-bd32-f60d4aab6b17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.523 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[9961ecc6-a1f7-4582-a42f-f9caf81e6fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 systemd-machined[154006]: New machine qemu-78-instance-000000b6.
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.562 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f308e3df-bc11-45fa-8b21-d5c853e4a3be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_controller[94850]: 2026-01-22T22:56:46Z|00744|binding|INFO|Setting lport 36036bd3-ee74-4b35-b2b5-8d585f1ae59a ovn-installed in OVS
Jan 22 22:56:46 compute-0 ovn_controller[94850]: 2026-01-22T22:56:46Z|00745|binding|INFO|Setting lport 36036bd3-ee74-4b35-b2b5-8d585f1ae59a up in Southbound
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.593 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[752be199-4cdd-4990-93ac-437b1458c47f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-000000b6.
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.6026] manager: (tapd8d77c31-40): new Veth device (/org/freedesktop/NetworkManager/Devices/351)
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.601 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[57407a61-31f9-4229-9c0f-3e6e3681f0f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.601 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 systemd-udevd[239545]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:56:46 compute-0 systemd-udevd[239543]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.6274] device (tap36036bd3-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.6286] device (tap36036bd3-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.645 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[10745923-d183-413f-90e5-5f4bacb482e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.648 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0adfc1-9764-4d22-a04f-9fd496b38f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.6744] device (tapd8d77c31-40): carrier: link connected
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.676 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e7278076-274e-423b-9bfb-63611651ab40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.697 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c980365d-dddd-408a-821e-386d72c024f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8d77c31-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:87:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624427, 'reachable_time': 36281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239570, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.714 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b6c163-f78c-4144-b6ff-688e8c319d18]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:8728'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624427, 'tstamp': 624427}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239572, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.732 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[55c7ca6e-0c17-4579-adce-dc4351133062]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8d77c31-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:87:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624427, 'reachable_time': 36281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239573, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.762 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f919c1-ae20-415e-9302-06d10402ea98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.832 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[00322e03-7b31-49e8-99e9-c7e1b3edcad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.833 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8d77c31-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.833 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.834 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8d77c31-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:46 compute-0 NetworkManager[54954]: <info>  [1769122606.8366] manager: (tapd8d77c31-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.836 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 kernel: tapd8d77c31-40: entered promiscuous mode
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.840 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.842 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8d77c31-40, col_values=(('external_ids', {'iface-id': '7e638a4c-ee74-4c71-b2dd-f7bbad609134'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.843 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 ovn_controller[94850]: 2026-01-22T22:56:46Z|00746|binding|INFO|Releasing lport 7e638a4c-ee74-4c71-b2dd-f7bbad609134 from this chassis (sb_readonly=0)
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.859 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.860 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.860 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[5f060136-bdfa-4a2e-b938-05cddfcffcba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.861 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-d8d77c31-420b-47d9-87ac-6c37fe7e216d
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID d8d77c31-420b-47d9-87ac-6c37fe7e216d
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:56:46 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:56:46.861 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'env', 'PROCESS_TAG=haproxy-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8d77c31-420b-47d9-87ac-6c37fe7e216d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.961 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122606.9608734, d9ca14c7-c946-461f-a618-05e1e60d8b75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.961 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] VM Started (Lifecycle Event)
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.979 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.984 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122606.96104, d9ca14c7-c946-461f-a618-05e1e60d8b75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:56:46 compute-0 nova_compute[182725]: 2026-01-22 22:56:46.984 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] VM Paused (Lifecycle Event)
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.017 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.025 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.047 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:56:47 compute-0 podman[239612]: 2026-01-22 22:56:47.239741136 +0000 UTC m=+0.049810439 container create 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 22:56:47 compute-0 systemd[1]: Started libpod-conmon-5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d.scope.
Jan 22 22:56:47 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:56:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9af65d681361a0d1f22ec212d76b48c087c8c821154cd42550efde7d954b2b95/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:56:47 compute-0 podman[239612]: 2026-01-22 22:56:47.211536365 +0000 UTC m=+0.021605708 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.307 182729 DEBUG nova.compute.manager [req-c8b17631-921a-4a65-8c26-e8ee69253502 req-be4d1d8e-4022-4705-a6e1-71ecd67d5e83 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.308 182729 DEBUG oslo_concurrency.lockutils [req-c8b17631-921a-4a65-8c26-e8ee69253502 req-be4d1d8e-4022-4705-a6e1-71ecd67d5e83 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.308 182729 DEBUG oslo_concurrency.lockutils [req-c8b17631-921a-4a65-8c26-e8ee69253502 req-be4d1d8e-4022-4705-a6e1-71ecd67d5e83 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.309 182729 DEBUG oslo_concurrency.lockutils [req-c8b17631-921a-4a65-8c26-e8ee69253502 req-be4d1d8e-4022-4705-a6e1-71ecd67d5e83 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.309 182729 DEBUG nova.compute.manager [req-c8b17631-921a-4a65-8c26-e8ee69253502 req-be4d1d8e-4022-4705-a6e1-71ecd67d5e83 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Processing event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.309 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:56:47 compute-0 podman[239612]: 2026-01-22 22:56:47.312243228 +0000 UTC m=+0.122312531 container init 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.313 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122607.3129733, d9ca14c7-c946-461f-a618-05e1e60d8b75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.313 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] VM Resumed (Lifecycle Event)
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.314 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.317 182729 INFO nova.virt.libvirt.driver [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Instance spawned successfully.
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.318 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:56:47 compute-0 podman[239612]: 2026-01-22 22:56:47.319935069 +0000 UTC m=+0.130004362 container start 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.337 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.343 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.345 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.346 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.346 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [NOTICE]   (239633) : New worker (239635) forked
Jan 22 22:56:47 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [NOTICE]   (239633) : Loading success.
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.346 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.347 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.347 182729 DEBUG nova.virt.libvirt.driver [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.380 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.443 182729 INFO nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Took 8.50 seconds to spawn the instance on the hypervisor.
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.443 182729 DEBUG nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.536 182729 INFO nova.compute.manager [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Took 9.12 seconds to build instance.
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.556 182729 DEBUG oslo_concurrency.lockutils [None req-f9435651-a72b-4238-a3d3-012d10aaa4ad 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.620 182729 DEBUG nova.network.neutron [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updated VIF entry in instance network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.621 182729 DEBUG nova.network.neutron [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.636 182729 DEBUG oslo_concurrency.lockutils [req-21626dd2-caa8-435c-ba80-151c0bee52e3 req-e3d38cb5-734f-475c-87c8-7cee8043d67e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:56:47 compute-0 nova_compute[182725]: 2026-01-22 22:56:47.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.402 182729 DEBUG nova.compute.manager [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.402 182729 DEBUG oslo_concurrency.lockutils [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.403 182729 DEBUG oslo_concurrency.lockutils [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.403 182729 DEBUG oslo_concurrency.lockutils [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.403 182729 DEBUG nova.compute.manager [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] No waiting events found dispatching network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.404 182729 WARNING nova.compute.manager [req-30dd66b8-2ed9-4812-912b-f4c629e9c420 req-27fb8416-631b-455e-9a65-7b4ccd94443d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received unexpected event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a for instance with vm_state active and task_state None.
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:49 compute-0 nova_compute[182725]: 2026-01-22 22:56:49.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:56:50 compute-0 nova_compute[182725]: 2026-01-22 22:56:50.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:50 compute-0 nova_compute[182725]: 2026-01-22 22:56:50.419 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:50 compute-0 nova_compute[182725]: 2026-01-22 22:56:50.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:56:51 compute-0 NetworkManager[54954]: <info>  [1769122611.3635] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.362 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:51 compute-0 NetworkManager[54954]: <info>  [1769122611.3647] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.424 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:51 compute-0 ovn_controller[94850]: 2026-01-22T22:56:51Z|00747|binding|INFO|Releasing lport 7e638a4c-ee74-4c71-b2dd-f7bbad609134 from this chassis (sb_readonly=0)
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.671 182729 DEBUG nova.compute.manager [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.672 182729 DEBUG nova.compute.manager [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing instance network info cache due to event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.673 182729 DEBUG oslo_concurrency.lockutils [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.674 182729 DEBUG oslo_concurrency.lockutils [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:56:51 compute-0 nova_compute[182725]: 2026-01-22 22:56:51.674 182729 DEBUG nova.network.neutron [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:56:53 compute-0 nova_compute[182725]: 2026-01-22 22:56:53.350 182729 DEBUG nova.network.neutron [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updated VIF entry in instance network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:56:53 compute-0 nova_compute[182725]: 2026-01-22 22:56:53.350 182729 DEBUG nova.network.neutron [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:56:53 compute-0 nova_compute[182725]: 2026-01-22 22:56:53.373 182729 DEBUG oslo_concurrency.lockutils [req-7b51623f-4d7a-4065-9ad9-b4222ade77ac req-80ab5f06-3e4d-4e8a-9746-913398de2042 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:56:55 compute-0 podman[239645]: 2026-01-22 22:56:55.160258186 +0000 UTC m=+0.079312562 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 22:56:55 compute-0 nova_compute[182725]: 2026-01-22 22:56:55.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:56:55 compute-0 nova_compute[182725]: 2026-01-22 22:56:55.421 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:00 compute-0 ovn_controller[94850]: 2026-01-22T22:57:00Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:f1:19 10.100.0.12
Jan 22 22:57:00 compute-0 ovn_controller[94850]: 2026-01-22T22:57:00Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:f1:19 10.100.0.12
Jan 22 22:57:00 compute-0 nova_compute[182725]: 2026-01-22 22:57:00.335 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:00 compute-0 nova_compute[182725]: 2026-01-22 22:57:00.423 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:01 compute-0 anacron[103311]: Job `cron.weekly' started
Jan 22 22:57:01 compute-0 anacron[103311]: Job `cron.weekly' terminated
Jan 22 22:57:01 compute-0 podman[239687]: 2026-01-22 22:57:01.152752276 +0000 UTC m=+0.077445835 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git)
Jan 22 22:57:01 compute-0 podman[239686]: 2026-01-22 22:57:01.217340942 +0000 UTC m=+0.138297689 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:57:05 compute-0 nova_compute[182725]: 2026-01-22 22:57:05.339 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:05 compute-0 nova_compute[182725]: 2026-01-22 22:57:05.424 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:10 compute-0 nova_compute[182725]: 2026-01-22 22:57:10.342 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:10 compute-0 nova_compute[182725]: 2026-01-22 22:57:10.462 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:12 compute-0 ovn_controller[94850]: 2026-01-22T22:57:12Z|00748|binding|INFO|Releasing lport 7e638a4c-ee74-4c71-b2dd-f7bbad609134 from this chassis (sb_readonly=0)
Jan 22 22:57:12 compute-0 nova_compute[182725]: 2026-01-22 22:57:12.464 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:12.468 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:13 compute-0 podman[239735]: 2026-01-22 22:57:13.151462189 +0000 UTC m=+0.072417421 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 22:57:13 compute-0 podman[239736]: 2026-01-22 22:57:13.167333162 +0000 UTC m=+0.084889660 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 22:57:13 compute-0 podman[239737]: 2026-01-22 22:57:13.16766832 +0000 UTC m=+0.079409793 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 22:57:15 compute-0 nova_compute[182725]: 2026-01-22 22:57:15.345 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:15 compute-0 nova_compute[182725]: 2026-01-22 22:57:15.464 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:20 compute-0 nova_compute[182725]: 2026-01-22 22:57:20.349 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:20 compute-0 nova_compute[182725]: 2026-01-22 22:57:20.466 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:23 compute-0 nova_compute[182725]: 2026-01-22 22:57:23.444 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:24.353 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:57:24 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:24.354 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:57:24 compute-0 nova_compute[182725]: 2026-01-22 22:57:24.355 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:25 compute-0 nova_compute[182725]: 2026-01-22 22:57:25.351 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:25 compute-0 nova_compute[182725]: 2026-01-22 22:57:25.470 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:25 compute-0 nova_compute[182725]: 2026-01-22 22:57:25.784 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:26 compute-0 podman[239797]: 2026-01-22 22:57:26.185900761 +0000 UTC m=+0.098746275 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:57:27 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:27.357 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:57:30 compute-0 nova_compute[182725]: 2026-01-22 22:57:30.353 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:30 compute-0 nova_compute[182725]: 2026-01-22 22:57:30.473 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:32 compute-0 podman[239817]: 2026-01-22 22:57:32.122674966 +0000 UTC m=+0.056467215 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 22:57:32 compute-0 podman[239816]: 2026-01-22 22:57:32.165344696 +0000 UTC m=+0.094635233 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, tcib_managed=true)
Jan 22 22:57:34 compute-0 nova_compute[182725]: 2026-01-22 22:57:34.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:35 compute-0 nova_compute[182725]: 2026-01-22 22:57:35.119 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:35 compute-0 nova_compute[182725]: 2026-01-22 22:57:35.406 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:35 compute-0 nova_compute[182725]: 2026-01-22 22:57:35.475 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:40 compute-0 nova_compute[182725]: 2026-01-22 22:57:40.409 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:40 compute-0 nova_compute[182725]: 2026-01-22 22:57:40.477 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:43 compute-0 nova_compute[182725]: 2026-01-22 22:57:43.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:43 compute-0 nova_compute[182725]: 2026-01-22 22:57:43.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:57:43 compute-0 nova_compute[182725]: 2026-01-22 22:57:43.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:57:44 compute-0 podman[239861]: 2026-01-22 22:57:44.125423326 +0000 UTC m=+0.056139337 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:57:44 compute-0 podman[239868]: 2026-01-22 22:57:44.132121362 +0000 UTC m=+0.049007619 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:57:44 compute-0 podman[239862]: 2026-01-22 22:57:44.15456408 +0000 UTC m=+0.079601240 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:57:45 compute-0 nova_compute[182725]: 2026-01-22 22:57:45.456 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:45 compute-0 nova_compute[182725]: 2026-01-22 22:57:45.480 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:46 compute-0 nova_compute[182725]: 2026-01-22 22:57:46.093 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:57:46 compute-0 nova_compute[182725]: 2026-01-22 22:57:46.094 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:57:46 compute-0 nova_compute[182725]: 2026-01-22 22:57:46.094 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 22:57:46 compute-0 nova_compute[182725]: 2026-01-22 22:57:46.094 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d9ca14c7-c946-461f-a618-05e1e60d8b75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:57:50 compute-0 nova_compute[182725]: 2026-01-22 22:57:50.458 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:50 compute-0 nova_compute[182725]: 2026-01-22 22:57:50.481 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.198 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.961 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.982 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.982 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.982 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.982 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.982 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.983 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.983 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.983 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.983 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:57:53 compute-0 nova_compute[182725]: 2026-01-22 22:57:53.984 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.007 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.008 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.008 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.008 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.091 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.189 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.191 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.261 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.448 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.449 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5488MB free_disk=73.28753280639648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.449 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.450 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.550 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance d9ca14c7-c946-461f-a618-05e1e60d8b75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.550 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.550 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.635 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.647 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.680 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:57:54 compute-0 nova_compute[182725]: 2026-01-22 22:57:54.680 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.236 182729 DEBUG nova.compute.manager [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.237 182729 DEBUG nova.compute.manager [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing instance network info cache due to event network-changed-36036bd3-ee74-4b35-b2b5-8d585f1ae59a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.237 182729 DEBUG oslo_concurrency.lockutils [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.237 182729 DEBUG oslo_concurrency.lockutils [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.238 182729 DEBUG nova.network.neutron [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Refreshing network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.415 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.416 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.416 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.416 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.417 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.431 182729 INFO nova.compute.manager [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Terminating instance
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.442 182729 DEBUG nova.compute.manager [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.461 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:55 compute-0 kernel: tap36036bd3-ee (unregistering): left promiscuous mode
Jan 22 22:57:55 compute-0 NetworkManager[54954]: <info>  [1769122675.4736] device (tap36036bd3-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:57:55 compute-0 ovn_controller[94850]: 2026-01-22T22:57:55Z|00749|binding|INFO|Releasing lport 36036bd3-ee74-4b35-b2b5-8d585f1ae59a from this chassis (sb_readonly=0)
Jan 22 22:57:55 compute-0 ovn_controller[94850]: 2026-01-22T22:57:55Z|00750|binding|INFO|Setting lport 36036bd3-ee74-4b35-b2b5-8d585f1ae59a down in Southbound
Jan 22 22:57:55 compute-0 ovn_controller[94850]: 2026-01-22T22:57:55Z|00751|binding|INFO|Removing iface tap36036bd3-ee ovn-installed in OVS
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.479 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.501 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:55.520 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:f1:19 10.100.0.12 2001:db8::f816:3eff:fe8e:f119'], port_security=['fa:16:3e:8e:f1:19 10.100.0.12 2001:db8::f816:3eff:fe8e:f119'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe8e:f119/64', 'neutron:device_id': 'd9ca14c7-c946-461f-a618-05e1e60d8b75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ed139db-5528-4ba0-9d69-09cd70ed61c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=36036bd3-ee74-4b35-b2b5-8d585f1ae59a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:57:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:55.521 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d unbound from our chassis
Jan 22 22:57:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:55.522 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8d77c31-420b-47d9-87ac-6c37fe7e216d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:57:55 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Jan 22 22:57:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:55.523 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[42b5fc7d-ae96-4e96-b522-783e77c58476]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:55.524 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d namespace which is not needed anymore
Jan 22 22:57:55 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b6.scope: Consumed 15.247s CPU time.
Jan 22 22:57:55 compute-0 systemd-machined[154006]: Machine qemu-78-instance-000000b6 terminated.
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [NOTICE]   (239633) : haproxy version is 2.8.14-c23fe91
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [NOTICE]   (239633) : path to executable is /usr/sbin/haproxy
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [WARNING]  (239633) : Exiting Master process...
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [WARNING]  (239633) : Exiting Master process...
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [ALERT]    (239633) : Current worker (239635) exited with code 143 (Terminated)
Jan 22 22:57:55 compute-0 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[239629]: [WARNING]  (239633) : All workers exited. Exiting... (0)
Jan 22 22:57:55 compute-0 systemd[1]: libpod-5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d.scope: Deactivated successfully.
Jan 22 22:57:55 compute-0 podman[239957]: 2026-01-22 22:57:55.699555553 +0000 UTC m=+0.096791887 container died 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.708 182729 INFO nova.virt.libvirt.driver [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Instance destroyed successfully.
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.709 182729 DEBUG nova.objects.instance [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid d9ca14c7-c946-461f-a618-05e1e60d8b75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.732 182729 DEBUG nova.virt.libvirt.vif [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1901361657',display_name='tempest-TestGettingAddress-server-1901361657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1901361657',id=182,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:56:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-pyobw218',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:56:47Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=d9ca14c7-c946-461f-a618-05e1e60d8b75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.733 182729 DEBUG nova.network.os_vif_util [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.733 182729 DEBUG nova.network.os_vif_util [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.734 182729 DEBUG os_vif [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.735 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36036bd3-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.737 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.742 182729 INFO os_vif [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:f1:19,bridge_name='br-int',has_traffic_filtering=True,id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36036bd3-ee')
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.742 182729 INFO nova.virt.libvirt.driver [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Deleting instance files /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75_del
Jan 22 22:57:55 compute-0 nova_compute[182725]: 2026-01-22 22:57:55.743 182729 INFO nova.virt.libvirt.driver [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Deletion of /var/lib/nova/instances/d9ca14c7-c946-461f-a618-05e1e60d8b75_del complete
Jan 22 22:57:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d-userdata-shm.mount: Deactivated successfully.
Jan 22 22:57:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-9af65d681361a0d1f22ec212d76b48c087c8c821154cd42550efde7d954b2b95-merged.mount: Deactivated successfully.
Jan 22 22:57:55 compute-0 podman[239957]: 2026-01-22 22:57:55.758405986 +0000 UTC m=+0.155642280 container cleanup 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 22:57:55 compute-0 systemd[1]: libpod-conmon-5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d.scope: Deactivated successfully.
Jan 22 22:57:56 compute-0 podman[240003]: 2026-01-22 22:57:56.040460326 +0000 UTC m=+0.261234154 container remove 5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.048 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e89cf3-d833-4506-97fd-8f7c8ba59d3e]: (4, ('Thu Jan 22 10:57:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d (5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d)\n5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d\nThu Jan 22 10:57:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d (5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d)\n5c915f1cd3384145427544ce6124b48836a55f908d4e84d5cf2723da9e6b492d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.051 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7eefd7-3e7d-4c31-9a11-3f64aae16fa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.052 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8d77c31-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:57:56 compute-0 kernel: tapd8d77c31-40: left promiscuous mode
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.055 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.061 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[65262d8b-2061-4f33-b5ad-c9fe8799b112]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.068 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.088 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1e04e47b-2be1-4bc2-b05e-5194a9328567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.090 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1f30b031-4504-4efc-aa72-8b17e85a64ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.109 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d67fe92a-5074-4154-ac32-3c8c8aaeef8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624419, 'reachable_time': 25155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240018, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.111 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:57:56 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:57:56.112 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[84211d28-17da-4399-b3be-de901ff0b4c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:57:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dd8d77c31\x2d420b\x2d47d9\x2d87ac\x2d6c37fe7e216d.mount: Deactivated successfully.
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.311 182729 INFO nova.compute.manager [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Took 0.87 seconds to destroy the instance on the hypervisor.
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.312 182729 DEBUG oslo.service.loopingcall [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.312 182729 DEBUG nova.compute.manager [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.312 182729 DEBUG nova.network.neutron [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.531 182729 DEBUG nova.compute.manager [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-unplugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.532 182729 DEBUG oslo_concurrency.lockutils [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.532 182729 DEBUG oslo_concurrency.lockutils [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.533 182729 DEBUG oslo_concurrency.lockutils [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.533 182729 DEBUG nova.compute.manager [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] No waiting events found dispatching network-vif-unplugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:57:56 compute-0 nova_compute[182725]: 2026-01-22 22:57:56.533 182729 DEBUG nova.compute.manager [req-c3608d6e-cbd2-4d8d-97d6-6bc6c57adcd4 req-d6c7d20e-49a0-4631-9a73-4d445e80d9e8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-unplugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:57:57 compute-0 podman[240019]: 2026-01-22 22:57:57.188996452 +0000 UTC m=+0.110420986 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.563 182729 DEBUG nova.network.neutron [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updated VIF entry in instance network info cache for port 36036bd3-ee74-4b35-b2b5-8d585f1ae59a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.564 182729 DEBUG nova.network.neutron [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [{"id": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "address": "fa:16:3e:8e:f1:19", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8e:f119", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36036bd3-ee", "ovs_interfaceid": "36036bd3-ee74-4b35-b2b5-8d585f1ae59a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.780 182729 DEBUG nova.network.neutron [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.870 182729 DEBUG nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.871 182729 DEBUG oslo_concurrency.lockutils [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.871 182729 DEBUG oslo_concurrency.lockutils [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.871 182729 DEBUG oslo_concurrency.lockutils [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.872 182729 DEBUG nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] No waiting events found dispatching network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.872 182729 WARNING nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received unexpected event network-vif-plugged-36036bd3-ee74-4b35-b2b5-8d585f1ae59a for instance with vm_state active and task_state deleting.
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.872 182729 DEBUG nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Received event network-vif-deleted-36036bd3-ee74-4b35-b2b5-8d585f1ae59a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.872 182729 INFO nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Neutron deleted interface 36036bd3-ee74-4b35-b2b5-8d585f1ae59a; detaching it from the instance and deleting it from the info cache
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.872 182729 DEBUG nova.network.neutron [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:57:58 compute-0 nova_compute[182725]: 2026-01-22 22:57:58.910 182729 DEBUG oslo_concurrency.lockutils [req-4794a681-753b-4db0-8144-36a81f6dea8f req-bb983282-762f-4916-863f-d9356949ee4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d9ca14c7-c946-461f-a618-05e1e60d8b75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.040 182729 INFO nova.compute.manager [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Took 2.73 seconds to deallocate network for instance.
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.043 182729 DEBUG nova.compute.manager [req-baa23b6b-2e5c-4876-8bc9-9044fafc39ea req-5a2b1b35-0ab8-4aba-8bc9-fc97774ba826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Detach interface failed, port_id=36036bd3-ee74-4b35-b2b5-8d585f1ae59a, reason: Instance d9ca14c7-c946-461f-a618-05e1e60d8b75 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.398 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.399 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.477 182729 DEBUG nova.compute.provider_tree [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.497 182729 DEBUG nova.scheduler.client.report [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.542 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.587 182729 INFO nova.scheduler.client.report [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance d9ca14c7-c946-461f-a618-05e1e60d8b75
Jan 22 22:57:59 compute-0 nova_compute[182725]: 2026-01-22 22:57:59.667 182729 DEBUG oslo_concurrency.lockutils [None req-cf9fb412-855d-4f4f-b6fb-9f2b0830c3e9 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "d9ca14c7-c946-461f-a618-05e1e60d8b75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:58:00 compute-0 nova_compute[182725]: 2026-01-22 22:58:00.544 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:00 compute-0 nova_compute[182725]: 2026-01-22 22:58:00.737 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:03 compute-0 podman[240041]: 2026-01-22 22:58:03.118686864 +0000 UTC m=+0.053350987 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 22:58:03 compute-0 podman[240040]: 2026-01-22 22:58:03.138716812 +0000 UTC m=+0.078735138 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 22:58:05 compute-0 nova_compute[182725]: 2026-01-22 22:58:05.548 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:05 compute-0 nova_compute[182725]: 2026-01-22 22:58:05.739 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:08 compute-0 nova_compute[182725]: 2026-01-22 22:58:08.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 22:58:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 22:58:09 compute-0 nova_compute[182725]: 2026-01-22 22:58:09.143 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:10 compute-0 nova_compute[182725]: 2026-01-22 22:58:10.549 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:10 compute-0 nova_compute[182725]: 2026-01-22 22:58:10.708 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122675.7072759, d9ca14c7-c946-461f-a618-05e1e60d8b75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:58:10 compute-0 nova_compute[182725]: 2026-01-22 22:58:10.709 182729 INFO nova.compute.manager [-] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] VM Stopped (Lifecycle Event)
Jan 22 22:58:10 compute-0 nova_compute[182725]: 2026-01-22 22:58:10.735 182729 DEBUG nova.compute.manager [None req-7dfd775d-7d42-4b5e-9403-9409dda4983b - - - - - -] [instance: d9ca14c7-c946-461f-a618-05e1e60d8b75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:58:10 compute-0 nova_compute[182725]: 2026-01-22 22:58:10.740 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:12.467 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:58:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:12.468 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:58:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:12.468 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:58:15 compute-0 podman[240090]: 2026-01-22 22:58:15.133011523 +0000 UTC m=+0.061148121 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:58:15 compute-0 podman[240097]: 2026-01-22 22:58:15.149606806 +0000 UTC m=+0.060105345 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 22:58:15 compute-0 podman[240091]: 2026-01-22 22:58:15.151184935 +0000 UTC m=+0.071513329 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 22:58:15 compute-0 nova_compute[182725]: 2026-01-22 22:58:15.550 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:15 compute-0 nova_compute[182725]: 2026-01-22 22:58:15.742 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:16 compute-0 nova_compute[182725]: 2026-01-22 22:58:16.676 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:20 compute-0 nova_compute[182725]: 2026-01-22 22:58:20.552 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:20 compute-0 nova_compute[182725]: 2026-01-22 22:58:20.743 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:25 compute-0 nova_compute[182725]: 2026-01-22 22:58:25.554 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:25 compute-0 nova_compute[182725]: 2026-01-22 22:58:25.825 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:28 compute-0 podman[240154]: 2026-01-22 22:58:28.115579038 +0000 UTC m=+0.057231153 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 22:58:30 compute-0 nova_compute[182725]: 2026-01-22 22:58:30.557 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:30 compute-0 nova_compute[182725]: 2026-01-22 22:58:30.827 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:34 compute-0 podman[240175]: 2026-01-22 22:58:34.151374925 +0000 UTC m=+0.077305902 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git)
Jan 22 22:58:34 compute-0 podman[240174]: 2026-01-22 22:58:34.159716392 +0000 UTC m=+0.090479009 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 22:58:35 compute-0 nova_compute[182725]: 2026-01-22 22:58:35.559 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:35 compute-0 nova_compute[182725]: 2026-01-22 22:58:35.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:36 compute-0 nova_compute[182725]: 2026-01-22 22:58:36.919 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:40 compute-0 nova_compute[182725]: 2026-01-22 22:58:40.602 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:40 compute-0 nova_compute[182725]: 2026-01-22 22:58:40.831 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:43 compute-0 nova_compute[182725]: 2026-01-22 22:58:43.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:43 compute-0 nova_compute[182725]: 2026-01-22 22:58:43.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:58:43 compute-0 nova_compute[182725]: 2026-01-22 22:58:43.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:58:43 compute-0 nova_compute[182725]: 2026-01-22 22:58:43.920 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:58:45 compute-0 nova_compute[182725]: 2026-01-22 22:58:45.649 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:45 compute-0 nova_compute[182725]: 2026-01-22 22:58:45.833 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:45 compute-0 nova_compute[182725]: 2026-01-22 22:58:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:45 compute-0 nova_compute[182725]: 2026-01-22 22:58:45.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:46 compute-0 podman[240219]: 2026-01-22 22:58:46.164745832 +0000 UTC m=+0.087879975 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:58:46 compute-0 podman[240221]: 2026-01-22 22:58:46.168346452 +0000 UTC m=+0.079483407 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 22:58:46 compute-0 podman[240220]: 2026-01-22 22:58:46.180639967 +0000 UTC m=+0.095508254 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.918 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:58:46 compute-0 nova_compute[182725]: 2026-01-22 22:58:46.918 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.119 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.120 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5681MB free_disk=73.31622314453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.120 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.186 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.187 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.207 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.221 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.244 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:58:47 compute-0 nova_compute[182725]: 2026-01-22 22:58:47.245 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:58:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:49.399 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:58:49 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:49.399 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:58:49 compute-0 nova_compute[182725]: 2026-01-22 22:58:49.400 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:50 compute-0 nova_compute[182725]: 2026-01-22 22:58:50.650 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:50 compute-0 nova_compute[182725]: 2026-01-22 22:58:50.835 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:51 compute-0 nova_compute[182725]: 2026-01-22 22:58:51.245 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:51 compute-0 nova_compute[182725]: 2026-01-22 22:58:51.246 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:58:51 compute-0 nova_compute[182725]: 2026-01-22 22:58:51.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:53 compute-0 nova_compute[182725]: 2026-01-22 22:58:53.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:58:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:58:54.401 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:58:54 compute-0 ovn_controller[94850]: 2026-01-22T22:58:54Z|00752|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 22:58:55 compute-0 nova_compute[182725]: 2026-01-22 22:58:55.653 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:55 compute-0 nova_compute[182725]: 2026-01-22 22:58:55.837 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:58:59 compute-0 podman[240289]: 2026-01-22 22:58:59.139879852 +0000 UTC m=+0.061116700 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:59:00 compute-0 nova_compute[182725]: 2026-01-22 22:59:00.654 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:00 compute-0 nova_compute[182725]: 2026-01-22 22:59:00.839 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:05 compute-0 podman[240310]: 2026-01-22 22:59:05.181577536 +0000 UTC m=+0.100327255 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter)
Jan 22 22:59:05 compute-0 podman[240309]: 2026-01-22 22:59:05.185468283 +0000 UTC m=+0.112835906 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 22 22:59:05 compute-0 nova_compute[182725]: 2026-01-22 22:59:05.656 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:05 compute-0 nova_compute[182725]: 2026-01-22 22:59:05.841 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:10 compute-0 nova_compute[182725]: 2026-01-22 22:59:10.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:10 compute-0 nova_compute[182725]: 2026-01-22 22:59:10.842 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:12.468 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:15 compute-0 nova_compute[182725]: 2026-01-22 22:59:15.728 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:15 compute-0 nova_compute[182725]: 2026-01-22 22:59:15.844 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:17 compute-0 podman[240356]: 2026-01-22 22:59:17.145606606 +0000 UTC m=+0.063922630 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 22:59:17 compute-0 podman[240357]: 2026-01-22 22:59:17.149034021 +0000 UTC m=+0.058272639 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 22:59:17 compute-0 podman[240355]: 2026-01-22 22:59:17.179047107 +0000 UTC m=+0.094483629 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 22:59:20 compute-0 nova_compute[182725]: 2026-01-22 22:59:20.730 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:20 compute-0 nova_compute[182725]: 2026-01-22 22:59:20.846 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:25 compute-0 nova_compute[182725]: 2026-01-22 22:59:25.731 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:25 compute-0 nova_compute[182725]: 2026-01-22 22:59:25.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.337 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.338 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.355 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.463 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.464 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.470 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.471 182729 INFO nova.compute.claims [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Claim successful on node compute-0.ctlplane.example.com
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.606 182729 DEBUG nova.compute.provider_tree [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.624 182729 DEBUG nova.scheduler.client.report [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.653 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.653 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.734 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.735 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.752 182729 INFO nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.774 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.919 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.921 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.922 182729 INFO nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Creating image(s)
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.925 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.926 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.927 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:29 compute-0 nova_compute[182725]: 2026-01-22 22:59:29.952 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.050 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.051 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.052 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.070 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:30 compute-0 podman[240420]: 2026-01-22 22:59:30.123303639 +0000 UTC m=+0.064485123 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.123 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.125 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.160 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.161 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.161 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.214 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.216 182729 DEBUG nova.virt.disk.api [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Checking if we can resize image /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.216 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.271 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.272 182729 DEBUG nova.virt.disk.api [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Cannot resize image /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.273 182729 DEBUG nova.objects.instance [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'migration_context' on Instance uuid 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.292 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.293 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Ensure instance console log exists: /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.294 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.294 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.295 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.546 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Successfully created port: b65785f8-c4bc-4484-8367-99c33433e919 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.734 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:30 compute-0 nova_compute[182725]: 2026-01-22 22:59:30.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.479 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Successfully updated port: b65785f8-c4bc-4484-8367-99c33433e919 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.511 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.512 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquired lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.512 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.588 182729 DEBUG nova.compute.manager [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-changed-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.589 182729 DEBUG nova.compute.manager [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Refreshing instance network info cache due to event network-changed-b65785f8-c4bc-4484-8367-99c33433e919. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.589 182729 DEBUG oslo_concurrency.lockutils [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 22:59:31 compute-0 nova_compute[182725]: 2026-01-22 22:59:31.679 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.354 182729 DEBUG nova.network.neutron [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Updating instance_info_cache with network_info: [{"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.396 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Releasing lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.397 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Instance network_info: |[{"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.398 182729 DEBUG oslo_concurrency.lockutils [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.399 182729 DEBUG nova.network.neutron [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Refreshing network info cache for port b65785f8-c4bc-4484-8367-99c33433e919 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.406 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Start _get_guest_xml network_info=[{"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.412 182729 WARNING nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.418 182729 DEBUG nova.virt.libvirt.host [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.420 182729 DEBUG nova.virt.libvirt.host [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.424 182729 DEBUG nova.virt.libvirt.host [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.424 182729 DEBUG nova.virt.libvirt.host [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.426 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.426 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.427 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.428 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.428 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.428 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.429 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.429 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.429 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.430 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.430 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.430 182729 DEBUG nova.virt.hardware [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.435 182729 DEBUG nova.virt.libvirt.vif [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1772729557',display_name='tempest-TestServerMultinode-server-1772729557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1772729557',id=186,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-xnzwfkm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMultinode-135
5577921-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:59:29Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=73022dfe-7fd4-4b3c-908e-ccc0e2f26a24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.436 182729 DEBUG nova.network.os_vif_util [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.437 182729 DEBUG nova.network.os_vif_util [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.438 182729 DEBUG nova.objects.instance [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'pci_devices' on Instance uuid 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.456 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] End _get_guest_xml xml=<domain type="kvm">
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <uuid>73022dfe-7fd4-4b3c-908e-ccc0e2f26a24</uuid>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <name>instance-000000ba</name>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <metadata>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:name>tempest-TestServerMultinode-server-1772729557</nova:name>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 22:59:32</nova:creationTime>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:user uuid="30a80763458b43478ba0f621b8b501f5">tempest-TestServerMultinode-1355577921-project-admin</nova:user>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:project uuid="648f17d42fa14c7a888033544026cf49">tempest-TestServerMultinode-1355577921</nova:project>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         <nova:port uuid="b65785f8-c4bc-4484-8367-99c33433e919">
Jan 22 22:59:32 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </metadata>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <system>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="serial">73022dfe-7fd4-4b3c-908e-ccc0e2f26a24</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="uuid">73022dfe-7fd4-4b3c-908e-ccc0e2f26a24</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </system>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <os>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </os>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <features>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <apic/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </features>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </clock>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </cpu>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   <devices>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.config"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </disk>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:23:40:9f"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <target dev="tapb65785f8-c4"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </interface>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/console.log" append="off"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </serial>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <video>
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </video>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </rng>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 22:59:32 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 22:59:32 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 22:59:32 compute-0 nova_compute[182725]:   </devices>
Jan 22 22:59:32 compute-0 nova_compute[182725]: </domain>
Jan 22 22:59:32 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.458 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Preparing to wait for external event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.458 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.459 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.459 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.460 182729 DEBUG nova.virt.libvirt.vif [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1772729557',display_name='tempest-TestServerMultinode-server-1772729557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1772729557',id=186,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-xnzwfkm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMul
tinode-1355577921-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:59:29Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=73022dfe-7fd4-4b3c-908e-ccc0e2f26a24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.461 182729 DEBUG nova.network.os_vif_util [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.462 182729 DEBUG nova.network.os_vif_util [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.462 182729 DEBUG os_vif [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.463 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.464 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.464 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.468 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.469 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb65785f8-c4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.469 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb65785f8-c4, col_values=(('external_ids', {'iface-id': 'b65785f8-c4bc-4484-8367-99c33433e919', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:40:9f', 'vm-uuid': '73022dfe-7fd4-4b3c-908e-ccc0e2f26a24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.471 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:32 compute-0 NetworkManager[54954]: <info>  [1769122772.4726] manager: (tapb65785f8-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.480 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.481 182729 INFO os_vif [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4')
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.547 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.548 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.548 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No VIF found with MAC fa:16:3e:23:40:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 22:59:32 compute-0 nova_compute[182725]: 2026-01-22 22:59:32.549 182729 INFO nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Using config drive
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.451 182729 INFO nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Creating config drive at /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.config
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.458 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_hg43qwy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.589 182729 DEBUG oslo_concurrency.processutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_hg43qwy" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 22:59:34 compute-0 kernel: tapb65785f8-c4: entered promiscuous mode
Jan 22 22:59:34 compute-0 NetworkManager[54954]: <info>  [1769122774.6606] manager: (tapb65785f8-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/356)
Jan 22 22:59:34 compute-0 ovn_controller[94850]: 2026-01-22T22:59:34Z|00753|binding|INFO|Claiming lport b65785f8-c4bc-4484-8367-99c33433e919 for this chassis.
Jan 22 22:59:34 compute-0 ovn_controller[94850]: 2026-01-22T22:59:34Z|00754|binding|INFO|b65785f8-c4bc-4484-8367-99c33433e919: Claiming fa:16:3e:23:40:9f 10.100.0.6
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.662 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.665 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:34 compute-0 systemd-udevd[240473]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.704 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:40:9f 10.100.0.6'], port_security=['fa:16:3e:23:40:9f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '73022dfe-7fd4-4b3c-908e-ccc0e2f26a24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82a04532-65dc-4565-8faf-3e7913e3093d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '648f17d42fa14c7a888033544026cf49', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2b775d78-5eae-44d4-acd5-c2e8bb690b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd576e24-eb7f-4cfe-9779-094eaf513ece, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b65785f8-c4bc-4484-8367-99c33433e919) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.705 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b65785f8-c4bc-4484-8367-99c33433e919 in datapath 82a04532-65dc-4565-8faf-3e7913e3093d bound to our chassis
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.706 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82a04532-65dc-4565-8faf-3e7913e3093d
Jan 22 22:59:34 compute-0 systemd-machined[154006]: New machine qemu-79-instance-000000ba.
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.716 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.718 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[48e4854d-4b8e-4c0e-aa34-4a651635abfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.718 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82a04532-61 in ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 22:59:34 compute-0 NetworkManager[54954]: <info>  [1769122774.7201] device (tapb65785f8-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 22:59:34 compute-0 NetworkManager[54954]: <info>  [1769122774.7207] device (tapb65785f8-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.720 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82a04532-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.721 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[fef46d66-bb68-4e7e-bb96-fd0d61ac9b8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.721 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[29dfb27d-3f76-42eb-8bdf-b2d022bb6b6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-000000ba.
Jan 22 22:59:34 compute-0 ovn_controller[94850]: 2026-01-22T22:59:34Z|00755|binding|INFO|Setting lport b65785f8-c4bc-4484-8367-99c33433e919 ovn-installed in OVS
Jan 22 22:59:34 compute-0 ovn_controller[94850]: 2026-01-22T22:59:34Z|00756|binding|INFO|Setting lport b65785f8-c4bc-4484-8367-99c33433e919 up in Southbound
Jan 22 22:59:34 compute-0 nova_compute[182725]: 2026-01-22 22:59:34.725 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.735 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd0365d-1e5f-468a-a2b8-3b151177063c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.759 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8c86b63c-f26b-4296-ad87-086aa48a6161]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.792 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[023b016f-87d5-4a28-aa22-5fa17ff2ab0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.798 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[56ca8957-4224-4363-91a1-4ef0fa67e82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 NetworkManager[54954]: <info>  [1769122774.8001] manager: (tap82a04532-60): new Veth device (/org/freedesktop/NetworkManager/Devices/357)
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.830 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6b98ba-a115-4db9-b092-8e1ef2827372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.834 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[52b758d4-b86c-4638-953e-badbae92ccdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 NetworkManager[54954]: <info>  [1769122774.8573] device (tap82a04532-60): carrier: link connected
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.866 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[ac85be0a-4da6-45e0-9415-3ac970a9541b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.891 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b400f6a9-d76e-493f-9e15-f9251a2e7ff5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82a04532-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:78:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641245, 'reachable_time': 15576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240507, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.914 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[879bf01b-9035-41f5-ab61-05fa6fdccf51]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:783b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 641245, 'tstamp': 641245}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240508, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.940 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddf9a23-61ef-447b-a970-8d7395c114e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82a04532-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:78:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641245, 'reachable_time': 15576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240509, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:34 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:34.975 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[b6156b24-b789-47ea-b388-15d28f17125d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.039 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122775.0388842, 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.040 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] VM Started (Lifecycle Event)
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.055 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c49708-fd24-43f1-8ec6-16a474d24c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.057 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.058 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82a04532-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.059 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.060 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82a04532-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.062 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122775.039932, 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.063 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] VM Paused (Lifecycle Event)
Jan 22 22:59:35 compute-0 kernel: tap82a04532-60: entered promiscuous mode
Jan 22 22:59:35 compute-0 NetworkManager[54954]: <info>  [1769122775.0641] manager: (tap82a04532-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.064 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.068 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82a04532-60, col_values=(('external_ids', {'iface-id': 'e9aceb3b-3bda-4638-9b85-f2d0610c9277'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.070 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:35 compute-0 ovn_controller[94850]: 2026-01-22T22:59:35Z|00757|binding|INFO|Releasing lport e9aceb3b-3bda-4638-9b85-f2d0610c9277 from this chassis (sb_readonly=0)
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.072 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.073 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[33982f56-ae0f-47f5-8b9f-333df606b850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.075 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: global
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-82a04532-65dc-4565-8faf-3e7913e3093d
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 82a04532-65dc-4565-8faf-3e7913e3093d
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 22:59:35 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:35.076 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'env', 'PROCESS_TAG=haproxy-82a04532-65dc-4565-8faf-3e7913e3093d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82a04532-65dc-4565-8faf-3e7913e3093d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.085 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.088 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.091 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.109 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:59:35 compute-0 podman[240547]: 2026-01-22 22:59:35.489727719 +0000 UTC m=+0.079147398 container create ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 22:59:35 compute-0 systemd[1]: Started libpod-conmon-ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2.scope.
Jan 22 22:59:35 compute-0 podman[240547]: 2026-01-22 22:59:35.452201496 +0000 UTC m=+0.041621225 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 22:59:35 compute-0 systemd[1]: Started libcrun container.
Jan 22 22:59:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc9e4e9d224f232a4237653ce65fdc13573778adba224c3b5f75f4986701d3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 22:59:35 compute-0 podman[240547]: 2026-01-22 22:59:35.582780762 +0000 UTC m=+0.172200471 container init ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 22:59:35 compute-0 podman[240547]: 2026-01-22 22:59:35.597164249 +0000 UTC m=+0.186583928 container start ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 22:59:35 compute-0 podman[240563]: 2026-01-22 22:59:35.628077288 +0000 UTC m=+0.083053466 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Jan 22 22:59:35 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [NOTICE]   (240591) : New worker (240609) forked
Jan 22 22:59:35 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [NOTICE]   (240591) : Loading success.
Jan 22 22:59:35 compute-0 podman[240560]: 2026-01-22 22:59:35.665673332 +0000 UTC m=+0.123735356 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:59:35 compute-0 nova_compute[182725]: 2026-01-22 22:59:35.735 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.196 182729 DEBUG nova.compute.manager [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.197 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.199 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.199 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.199 182729 DEBUG nova.compute.manager [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Processing event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.200 182729 DEBUG nova.compute.manager [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.200 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.200 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.200 182729 DEBUG oslo_concurrency.lockutils [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.200 182729 DEBUG nova.compute.manager [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] No waiting events found dispatching network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.201 182729 WARNING nova.compute.manager [req-6e7ca601-3060-4b8e-a74a-e463c1dfff57 req-25ca3301-1ca8-4616-88b6-064a761970e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received unexpected event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 for instance with vm_state building and task_state spawning.
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.201 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.206 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122776.2062087, 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.207 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] VM Resumed (Lifecycle Event)
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.209 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.212 182729 INFO nova.virt.libvirt.driver [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Instance spawned successfully.
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.212 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.231 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.238 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.241 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.241 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.242 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.243 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.243 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.244 182729 DEBUG nova.virt.libvirt.driver [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.255 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.314 182729 DEBUG nova.network.neutron [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Updated VIF entry in instance network info cache for port b65785f8-c4bc-4484-8367-99c33433e919. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.315 182729 DEBUG nova.network.neutron [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Updating instance_info_cache with network_info: [{"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.332 182729 INFO nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Took 6.41 seconds to spawn the instance on the hypervisor.
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.332 182729 DEBUG nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.335 182729 DEBUG oslo_concurrency.lockutils [req-f793ee57-1db9-430a-a9d4-9a7fd9f78dcc req-82f4f3f4-bca6-4f07-8499-1b8ef5169488 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.444 182729 INFO nova.compute.manager [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Took 7.03 seconds to build instance.
Jan 22 22:59:36 compute-0 nova_compute[182725]: 2026-01-22 22:59:36.465 182729 DEBUG oslo_concurrency.lockutils [None req-10d1e56f-a300-4245-b570-065b580f1e2c 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:37 compute-0 nova_compute[182725]: 2026-01-22 22:59:37.471 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:38 compute-0 nova_compute[182725]: 2026-01-22 22:59:38.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:40 compute-0 nova_compute[182725]: 2026-01-22 22:59:40.782 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:42 compute-0 nova_compute[182725]: 2026-01-22 22:59:42.474 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.981 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.981 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.982 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.982 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.982 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:44 compute-0 nova_compute[182725]: 2026-01-22 22:59:44.996 182729 INFO nova.compute.manager [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Terminating instance
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.008 182729 DEBUG nova.compute.manager [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 22:59:45 compute-0 kernel: tapb65785f8-c4 (unregistering): left promiscuous mode
Jan 22 22:59:45 compute-0 NetworkManager[54954]: <info>  [1769122785.0280] device (tapb65785f8-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 ovn_controller[94850]: 2026-01-22T22:59:45Z|00758|binding|INFO|Releasing lport b65785f8-c4bc-4484-8367-99c33433e919 from this chassis (sb_readonly=0)
Jan 22 22:59:45 compute-0 ovn_controller[94850]: 2026-01-22T22:59:45Z|00759|binding|INFO|Setting lport b65785f8-c4bc-4484-8367-99c33433e919 down in Southbound
Jan 22 22:59:45 compute-0 ovn_controller[94850]: 2026-01-22T22:59:45Z|00760|binding|INFO|Removing iface tapb65785f8-c4 ovn-installed in OVS
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.045 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:40:9f 10.100.0.6'], port_security=['fa:16:3e:23:40:9f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '73022dfe-7fd4-4b3c-908e-ccc0e2f26a24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82a04532-65dc-4565-8faf-3e7913e3093d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '648f17d42fa14c7a888033544026cf49', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2b775d78-5eae-44d4-acd5-c2e8bb690b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd576e24-eb7f-4cfe-9779-094eaf513ece, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=b65785f8-c4bc-4484-8367-99c33433e919) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.047 104215 INFO neutron.agent.ovn.metadata.agent [-] Port b65785f8-c4bc-4484-8367-99c33433e919 in datapath 82a04532-65dc-4565-8faf-3e7913e3093d unbound from our chassis
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.048 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82a04532-65dc-4565-8faf-3e7913e3093d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.050 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[09ea7359-caec-4ec2-a226-5ecf30b044c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.050 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d namespace which is not needed anymore
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.051 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 22 22:59:45 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ba.scope: Consumed 9.165s CPU time.
Jan 22 22:59:45 compute-0 systemd-machined[154006]: Machine qemu-79-instance-000000ba terminated.
Jan 22 22:59:45 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [NOTICE]   (240591) : haproxy version is 2.8.14-c23fe91
Jan 22 22:59:45 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [NOTICE]   (240591) : path to executable is /usr/sbin/haproxy
Jan 22 22:59:45 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [WARNING]  (240591) : Exiting Master process...
Jan 22 22:59:45 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [ALERT]    (240591) : Current worker (240609) exited with code 143 (Terminated)
Jan 22 22:59:45 compute-0 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[240564]: [WARNING]  (240591) : All workers exited. Exiting... (0)
Jan 22 22:59:45 compute-0 systemd[1]: libpod-ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2.scope: Deactivated successfully.
Jan 22 22:59:45 compute-0 podman[240647]: 2026-01-22 22:59:45.182815475 +0000 UTC m=+0.046946648 container died ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 22:59:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2-userdata-shm.mount: Deactivated successfully.
Jan 22 22:59:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-6bc9e4e9d224f232a4237653ce65fdc13573778adba224c3b5f75f4986701d3a-merged.mount: Deactivated successfully.
Jan 22 22:59:45 compute-0 podman[240647]: 2026-01-22 22:59:45.230031539 +0000 UTC m=+0.094162712 container cleanup ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.230 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.234 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 systemd[1]: libpod-conmon-ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2.scope: Deactivated successfully.
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.269 182729 INFO nova.virt.libvirt.driver [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Instance destroyed successfully.
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.270 182729 DEBUG nova.objects.instance [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'resources' on Instance uuid 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 22:59:45 compute-0 podman[240684]: 2026-01-22 22:59:45.29485699 +0000 UTC m=+0.041653156 container remove ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.299 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[47469ae5-c3c6-4d40-b2b4-77ada839dc7e]: (4, ('Thu Jan 22 10:59:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d (ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2)\nac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2\nThu Jan 22 10:59:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d (ac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2)\nac7e3ec9529918bb7ca2d9c3f55f730bc651b59213e63b0e46ba569e5682d1b2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.300 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0e1a39-2bd9-49af-9713-85c04ea6f180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.301 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82a04532-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.303 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 kernel: tap82a04532-60: left promiscuous mode
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.315 182729 DEBUG nova.virt.libvirt.vif [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1772729557',display_name='tempest-TestServerMultinode-server-1772729557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1772729557',id=186,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:59:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-xnzwfkm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMultinode-1355577921-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:59:36Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=73022dfe-7fd4-4b3c-908e-ccc0e2f26a24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.316 182729 DEBUG nova.network.os_vif_util [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "b65785f8-c4bc-4484-8367-99c33433e919", "address": "fa:16:3e:23:40:9f", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb65785f8-c4", "ovs_interfaceid": "b65785f8-c4bc-4484-8367-99c33433e919", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.316 182729 DEBUG nova.network.os_vif_util [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.317 182729 DEBUG os_vif [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.318 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.319 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb65785f8-c4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.324 182729 INFO os_vif [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:40:9f,bridge_name='br-int',has_traffic_filtering=True,id=b65785f8-c4bc-4484-8367-99c33433e919,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb65785f8-c4')
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.325 182729 INFO nova.virt.libvirt.driver [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Deleting instance files /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24_del
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.325 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7737b9d5-b3b8-4d4e-8654-7e08a92571b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.326 182729 INFO nova.virt.libvirt.driver [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Deletion of /var/lib/nova/instances/73022dfe-7fd4-4b3c-908e-ccc0e2f26a24_del complete
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.349 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[d667e282-86ee-411b-9eb1-2994e5b8be34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.350 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4c92aec2-ccc9-4ff8-aeca-5da0895a5526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.369 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ccaf8382-56dc-4640-96c3-1539c181f69b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641238, 'reachable_time': 34674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240708, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.371 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 22:59:45 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:45.372 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[328e2497-775c-4840-b035-904ad2eed1e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 22:59:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d82a04532\x2d65dc\x2d4565\x2d8faf\x2d3e7913e3093d.mount: Deactivated successfully.
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.415 182729 INFO nova.compute.manager [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.416 182729 DEBUG oslo.service.loopingcall [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.416 182729 DEBUG nova.compute.manager [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.416 182729 DEBUG nova.network.neutron [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.612 182729 DEBUG nova.compute.manager [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-unplugged-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.613 182729 DEBUG oslo_concurrency.lockutils [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.613 182729 DEBUG oslo_concurrency.lockutils [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.614 182729 DEBUG oslo_concurrency.lockutils [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.614 182729 DEBUG nova.compute.manager [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] No waiting events found dispatching network-vif-unplugged-b65785f8-c4bc-4484-8367-99c33433e919 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.614 182729 DEBUG nova.compute.manager [req-26a8584d-2f30-4021-8552-6d3b29f63365 req-02ab3bc7-15ec-427f-9db9-56dfe84667e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-unplugged-b65785f8-c4bc-4484-8367-99c33433e919 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.783 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 22 22:59:45 compute-0 nova_compute[182725]: 2026-01-22 22:59:45.906 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.253 182729 DEBUG nova.network.neutron [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.281 182729 INFO nova.compute.manager [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Took 0.86 seconds to deallocate network for instance.
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.367 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.368 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.487 182729 DEBUG nova.compute.provider_tree [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.504 182729 DEBUG nova.scheduler.client.report [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.525 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.611 182729 INFO nova.scheduler.client.report [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Deleted allocations for instance 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.709 182729 DEBUG oslo_concurrency.lockutils [None req-cc7c963f-7544-48fa-b990-bf5861b1c8bf 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.903 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.904 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.904 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.904 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 22:59:46 compute-0 nova_compute[182725]: 2026-01-22 22:59:46.919 182729 DEBUG nova.compute.manager [req-850aebe3-4638-4bb5-827e-1e8f24ac108f req-bf5490fd-aca7-4b7c-b813-1158ad417911 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-deleted-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.084 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.085 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5632MB free_disk=73.31621551513672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.085 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.085 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.135 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.136 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.162 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.176 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.199 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.199 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.706 182729 DEBUG nova.compute.manager [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.706 182729 DEBUG oslo_concurrency.lockutils [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.707 182729 DEBUG oslo_concurrency.lockutils [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.707 182729 DEBUG oslo_concurrency.lockutils [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73022dfe-7fd4-4b3c-908e-ccc0e2f26a24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.707 182729 DEBUG nova.compute.manager [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] No waiting events found dispatching network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 22:59:47 compute-0 nova_compute[182725]: 2026-01-22 22:59:47.707 182729 WARNING nova.compute.manager [req-382e1706-b49b-46fb-9ad3-c04f302aa44b req-182e7506-ca74-490a-a396-2c37d22b77cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Received unexpected event network-vif-plugged-b65785f8-c4bc-4484-8367-99c33433e919 for instance with vm_state deleted and task_state None.
Jan 22 22:59:48 compute-0 podman[240710]: 2026-01-22 22:59:48.128018436 +0000 UTC m=+0.057988803 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 22:59:48 compute-0 podman[240711]: 2026-01-22 22:59:48.129613895 +0000 UTC m=+0.057278394 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 22:59:48 compute-0 podman[240712]: 2026-01-22 22:59:48.142830834 +0000 UTC m=+0.058154836 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 22:59:48 compute-0 nova_compute[182725]: 2026-01-22 22:59:48.200 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:48 compute-0 nova_compute[182725]: 2026-01-22 22:59:48.200 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:48 compute-0 nova_compute[182725]: 2026-01-22 22:59:48.200 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:50 compute-0 nova_compute[182725]: 2026-01-22 22:59:50.321 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:50 compute-0 nova_compute[182725]: 2026-01-22 22:59:50.785 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:50 compute-0 nova_compute[182725]: 2026-01-22 22:59:50.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:50 compute-0 nova_compute[182725]: 2026-01-22 22:59:50.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 22:59:51 compute-0 nova_compute[182725]: 2026-01-22 22:59:51.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:51.436 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 22:59:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:51.438 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 22:59:51 compute-0 ovn_metadata_agent[104210]: 2026-01-22 22:59:51.439 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 22:59:52 compute-0 nova_compute[182725]: 2026-01-22 22:59:52.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:52 compute-0 nova_compute[182725]: 2026-01-22 22:59:52.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 22:59:53 compute-0 nova_compute[182725]: 2026-01-22 22:59:53.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 22:59:54 compute-0 nova_compute[182725]: 2026-01-22 22:59:54.139 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:55 compute-0 nova_compute[182725]: 2026-01-22 22:59:55.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:55 compute-0 nova_compute[182725]: 2026-01-22 22:59:55.788 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 22:59:55 compute-0 nova_compute[182725]: 2026-01-22 22:59:55.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:00 compute-0 nova_compute[182725]: 2026-01-22 23:00:00.269 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122785.2674248, 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 23:00:00 compute-0 nova_compute[182725]: 2026-01-22 23:00:00.269 182729 INFO nova.compute.manager [-] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] VM Stopped (Lifecycle Event)
Jan 22 23:00:00 compute-0 nova_compute[182725]: 2026-01-22 23:00:00.297 182729 DEBUG nova.compute.manager [None req-6feca66d-4431-4f87-8a15-74ccf317baed - - - - - -] [instance: 73022dfe-7fd4-4b3c-908e-ccc0e2f26a24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:00:00 compute-0 nova_compute[182725]: 2026-01-22 23:00:00.323 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:00 compute-0 nova_compute[182725]: 2026-01-22 23:00:00.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:01 compute-0 podman[240777]: 2026-01-22 23:00:01.181719558 +0000 UTC m=+0.105521334 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 23:00:05 compute-0 nova_compute[182725]: 2026-01-22 23:00:05.325 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:05 compute-0 nova_compute[182725]: 2026-01-22 23:00:05.790 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:06 compute-0 podman[240800]: 2026-01-22 23:00:06.177868282 +0000 UTC m=+0.096983921 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 23:00:06 compute-0 podman[240799]: 2026-01-22 23:00:06.184395485 +0000 UTC m=+0.108281373 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 23:00:07 compute-0 nova_compute[182725]: 2026-01-22 23:00:07.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:07 compute-0 nova_compute[182725]: 2026-01-22 23:00:07.899 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:00:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:00:10 compute-0 nova_compute[182725]: 2026-01-22 23:00:10.326 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:10 compute-0 nova_compute[182725]: 2026-01-22 23:00:10.829 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:12.470 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:14 compute-0 nova_compute[182725]: 2026-01-22 23:00:14.897 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:14 compute-0 nova_compute[182725]: 2026-01-22 23:00:14.899 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 23:00:15 compute-0 nova_compute[182725]: 2026-01-22 23:00:15.216 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 23:00:15 compute-0 nova_compute[182725]: 2026-01-22 23:00:15.328 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:15 compute-0 nova_compute[182725]: 2026-01-22 23:00:15.831 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:19 compute-0 podman[240843]: 2026-01-22 23:00:19.118629788 +0000 UTC m=+0.056049554 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 23:00:19 compute-0 podman[240844]: 2026-01-22 23:00:19.132610716 +0000 UTC m=+0.054008564 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 23:00:19 compute-0 podman[240845]: 2026-01-22 23:00:19.148127841 +0000 UTC m=+0.063359835 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 23:00:20 compute-0 nova_compute[182725]: 2026-01-22 23:00:20.330 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:20 compute-0 nova_compute[182725]: 2026-01-22 23:00:20.834 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:25 compute-0 nova_compute[182725]: 2026-01-22 23:00:25.331 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:25 compute-0 nova_compute[182725]: 2026-01-22 23:00:25.836 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:27 compute-0 nova_compute[182725]: 2026-01-22 23:00:27.900 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:27 compute-0 nova_compute[182725]: 2026-01-22 23:00:27.901 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:27 compute-0 nova_compute[182725]: 2026-01-22 23:00:27.922 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.037 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.038 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.048 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.049 182729 INFO nova.compute.claims [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Claim successful on node compute-0.ctlplane.example.com
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.190 182729 DEBUG nova.compute.provider_tree [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:00:28 compute-0 ovn_controller[94850]: 2026-01-22T23:00:28Z|00761|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.205 182729 DEBUG nova.scheduler.client.report [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.228 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.230 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.308 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.309 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.330 182729 INFO nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.347 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.481 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.482 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.482 182729 INFO nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Creating image(s)
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.483 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.483 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.484 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.496 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.582 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.583 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.584 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.597 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.664 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.665 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.702 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.704 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.704 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.762 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.763 182729 DEBUG nova.virt.disk.api [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Checking if we can resize image /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.763 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.857 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.859 182729 DEBUG nova.virt.disk.api [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Cannot resize image /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.860 182729 DEBUG nova.objects.instance [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lazy-loading 'migration_context' on Instance uuid 49a28248-68bf-4f1c-a38f-01483bb35cf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.873 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.873 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Ensure instance console log exists: /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.874 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.875 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:28 compute-0 nova_compute[182725]: 2026-01-22 23:00:28.875 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:29 compute-0 nova_compute[182725]: 2026-01-22 23:00:29.454 182729 DEBUG nova.policy [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd5c61ac7024e4bf483db50ec6fa503cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63a8d5918d0a40388e1827e68bbd7188', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 23:00:29 compute-0 sshd-session[240925]: Connection closed by 106.63.7.208 port 43894
Jan 22 23:00:30 compute-0 nova_compute[182725]: 2026-01-22 23:00:30.332 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:30 compute-0 nova_compute[182725]: 2026-01-22 23:00:30.870 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:32 compute-0 nova_compute[182725]: 2026-01-22 23:00:32.064 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Successfully created port: 74821d1a-2c78-4cec-90ae-32a1aa1c77bc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 23:00:32 compute-0 podman[240926]: 2026-01-22 23:00:32.137033282 +0000 UTC m=+0.069033707 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.507 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Successfully updated port: 74821d1a-2c78-4cec-90ae-32a1aa1c77bc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.527 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.528 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.528 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.642 182729 DEBUG nova.compute.manager [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.643 182729 DEBUG nova.compute.manager [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing instance network info cache due to event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 23:00:34 compute-0 nova_compute[182725]: 2026-01-22 23:00:34.643 182729 DEBUG oslo_concurrency.lockutils [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:00:35 compute-0 nova_compute[182725]: 2026-01-22 23:00:35.334 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:35 compute-0 nova_compute[182725]: 2026-01-22 23:00:35.460 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 23:00:35 compute-0 nova_compute[182725]: 2026-01-22 23:00:35.873 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.660 182729 DEBUG nova.network.neutron [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.679 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.679 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance network_info: |[{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.680 182729 DEBUG oslo_concurrency.lockutils [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.681 182729 DEBUG nova.network.neutron [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.686 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Start _get_guest_xml network_info=[{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.694 182729 WARNING nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.705 182729 DEBUG nova.virt.libvirt.host [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.706 182729 DEBUG nova.virt.libvirt.host [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.711 182729 DEBUG nova.virt.libvirt.host [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.712 182729 DEBUG nova.virt.libvirt.host [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.714 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.715 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.716 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.716 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.717 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.718 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.718 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.719 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.720 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.720 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.720 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.721 182729 DEBUG nova.virt.hardware [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.729 182729 DEBUG nova.virt.libvirt.vif [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T23:00:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-893723156',display_name='tempest-TestShelveInstance-server-893723156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-893723156',id=188,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC3HtHpV1FD2iesqzTdA4pNZO4ZemGeAj/PJq6gQTLKoxhM8z95o7fm8rT3rH9LD9Kyoq2hFFqCG8KrM/FAzOzMggGNvnKC1BVAxCZoE730yH6SjrGFZAOsvREtyXKBxKA==',key_name='tempest-TestShelveInstance-205565771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63a8d5918d0a40388e1827e68bbd7188',ramdisk_id='',reservation_id='r-a589qtli',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1247982194',owner_user_name='tempest-TestShelveInstance-1247982194-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T23:00:28Z,user_data=None,user_id='d5c61ac7024e4bf483db50ec6fa503cd',uuid=49a28248-68bf-4f1c-a38f-01483bb35cf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.730 182729 DEBUG nova.network.os_vif_util [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converting VIF {"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.731 182729 DEBUG nova.network.os_vif_util [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.733 182729 DEBUG nova.objects.instance [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49a28248-68bf-4f1c-a38f-01483bb35cf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.750 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <uuid>49a28248-68bf-4f1c-a38f-01483bb35cf8</uuid>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <name>instance-000000bc</name>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <memory>131072</memory>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <vcpu>1</vcpu>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <metadata>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:name>tempest-TestShelveInstance-server-893723156</nova:name>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:creationTime>2026-01-22 23:00:36</nova:creationTime>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:flavor name="m1.nano">
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:memory>128</nova:memory>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:disk>1</nova:disk>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:swap>0</nova:swap>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:vcpus>1</nova:vcpus>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       </nova:flavor>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:owner>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:user uuid="d5c61ac7024e4bf483db50ec6fa503cd">tempest-TestShelveInstance-1247982194-project-member</nova:user>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:project uuid="63a8d5918d0a40388e1827e68bbd7188">tempest-TestShelveInstance-1247982194</nova:project>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       </nova:owner>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <nova:ports>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         <nova:port uuid="74821d1a-2c78-4cec-90ae-32a1aa1c77bc">
Jan 22 23:00:36 compute-0 nova_compute[182725]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:         </nova:port>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       </nova:ports>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </nova:instance>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </metadata>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <sysinfo type="smbios">
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <system>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="manufacturer">RDO</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="product">OpenStack Compute</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="serial">49a28248-68bf-4f1c-a38f-01483bb35cf8</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="uuid">49a28248-68bf-4f1c-a38f-01483bb35cf8</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <entry name="family">Virtual Machine</entry>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </system>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </sysinfo>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <os>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <boot dev="hd"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <smbios mode="sysinfo"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </os>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <features>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <acpi/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <apic/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <vmcoreinfo/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </features>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <clock offset="utc">
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <timer name="hpet" present="no"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </clock>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <cpu mode="custom" match="exact">
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <model>Nehalem</model>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </cpu>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   <devices>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <disk type="file" device="disk">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <target dev="vda" bus="virtio"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <disk type="file" device="cdrom">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <source file="/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.config"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <target dev="sda" bus="sata"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </disk>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <interface type="ethernet">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <mac address="fa:16:3e:bc:bc:e1"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <mtu size="1442"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <target dev="tap74821d1a-2c"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </interface>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <serial type="pty">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <log file="/var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/console.log" append="off"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </serial>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <video>
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <model type="virtio"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </video>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <input type="tablet" bus="usb"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <rng model="virtio">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <backend model="random">/dev/urandom</backend>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </rng>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <controller type="usb" index="0"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     <memballoon model="virtio">
Jan 22 23:00:36 compute-0 nova_compute[182725]:       <stats period="10"/>
Jan 22 23:00:36 compute-0 nova_compute[182725]:     </memballoon>
Jan 22 23:00:36 compute-0 nova_compute[182725]:   </devices>
Jan 22 23:00:36 compute-0 nova_compute[182725]: </domain>
Jan 22 23:00:36 compute-0 nova_compute[182725]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.752 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Preparing to wait for external event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.753 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.753 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.753 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.755 182729 DEBUG nova.virt.libvirt.vif [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T23:00:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-893723156',display_name='tempest-TestShelveInstance-server-893723156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-893723156',id=188,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC3HtHpV1FD2iesqzTdA4pNZO4ZemGeAj/PJq6gQTLKoxhM8z95o7fm8rT3rH9LD9Kyoq2hFFqCG8KrM/FAzOzMggGNvnKC1BVAxCZoE730yH6SjrGFZAOsvREtyXKBxKA==',key_name='tempest-TestShelveInstance-205565771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63a8d5918d0a40388e1827e68bbd7188',ramdisk_id='',reservation_id='r-a589qtli',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1247982194',owner_user_name='tempest-TestShelveInstance-1247982194-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T23:00:28Z,user_data=None,user_id='d5c61ac7024e4bf483db50ec6fa503cd',uuid=49a28248-68bf-4f1c-a38f-01483bb35cf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.755 182729 DEBUG nova.network.os_vif_util [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converting VIF {"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.756 182729 DEBUG nova.network.os_vif_util [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.757 182729 DEBUG os_vif [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.758 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.758 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.759 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.763 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.763 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74821d1a-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.764 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74821d1a-2c, col_values=(('external_ids', {'iface-id': '74821d1a-2c78-4cec-90ae-32a1aa1c77bc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:bc:e1', 'vm-uuid': '49a28248-68bf-4f1c-a38f-01483bb35cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.766 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:36 compute-0 NetworkManager[54954]: <info>  [1769122836.7675] manager: (tap74821d1a-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.770 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.776 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.777 182729 INFO os_vif [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c')
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.823 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.824 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.824 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] No VIF found with MAC fa:16:3e:bc:bc:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 23:00:36 compute-0 nova_compute[182725]: 2026-01-22 23:00:36.824 182729 INFO nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Using config drive
Jan 22 23:00:37 compute-0 podman[240950]: 2026-01-22 23:00:37.164651111 +0000 UTC m=+0.081506676 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 23:00:37 compute-0 podman[240949]: 2026-01-22 23:00:37.224393815 +0000 UTC m=+0.148914721 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.578 182729 INFO nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Creating config drive at /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.config
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.588 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq9_ej4c9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.722 182729 DEBUG oslo_concurrency.processutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq9_ej4c9" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:37 compute-0 kernel: tap74821d1a-2c: entered promiscuous mode
Jan 22 23:00:37 compute-0 NetworkManager[54954]: <info>  [1769122837.8005] manager: (tap74821d1a-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/360)
Jan 22 23:00:37 compute-0 ovn_controller[94850]: 2026-01-22T23:00:37Z|00762|binding|INFO|Claiming lport 74821d1a-2c78-4cec-90ae-32a1aa1c77bc for this chassis.
Jan 22 23:00:37 compute-0 ovn_controller[94850]: 2026-01-22T23:00:37Z|00763|binding|INFO|74821d1a-2c78-4cec-90ae-32a1aa1c77bc: Claiming fa:16:3e:bc:bc:e1 10.100.0.8
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.801 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:37 compute-0 systemd-udevd[241011]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 23:00:37 compute-0 systemd-machined[154006]: New machine qemu-80-instance-000000bc.
Jan 22 23:00:37 compute-0 NetworkManager[54954]: <info>  [1769122837.8529] device (tap74821d1a-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 23:00:37 compute-0 NetworkManager[54954]: <info>  [1769122837.8539] device (tap74821d1a-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.887 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:37 compute-0 ovn_controller[94850]: 2026-01-22T23:00:37Z|00764|binding|INFO|Setting lport 74821d1a-2c78-4cec-90ae-32a1aa1c77bc ovn-installed in OVS
Jan 22 23:00:37 compute-0 nova_compute[182725]: 2026-01-22 23:00:37.893 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:37 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-000000bc.
Jan 22 23:00:37 compute-0 ovn_controller[94850]: 2026-01-22T23:00:37Z|00765|binding|INFO|Setting lport 74821d1a-2c78-4cec-90ae-32a1aa1c77bc up in Southbound
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.952 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:bc:e1 10.100.0.8'], port_security=['fa:16:3e:bc:bc:e1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49a28248-68bf-4f1c-a38f-01483bb35cf8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61dce738-581f-4a17-826d-74026fc4bd1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63a8d5918d0a40388e1827e68bbd7188', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f49bec43-bd9f-46cd-9803-de9f64df03d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e7715dc-149e-449b-b509-948ea347b840, chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=74821d1a-2c78-4cec-90ae-32a1aa1c77bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.953 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc in datapath 61dce738-581f-4a17-826d-74026fc4bd1e bound to our chassis
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.955 104215 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61dce738-581f-4a17-826d-74026fc4bd1e
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.966 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa115d5-fdbd-4440-aafb-074ab746d485]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.967 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61dce738-51 in ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.968 211671 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61dce738-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.969 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1f47e2c8-49cf-491b-9466-bdd6d08e12cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.970 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[e3630ce4-9cd7-4ef2-b3e9-fe9ff5f78b67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:37 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:37.987 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf1e648-2dbe-4a12-a1be-4eb3c9220199]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.002 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce1aa1c-ba71-4cee-ada7-1802947c975a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.042 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a87801-e3ca-48b9-a8d0-9ac30cad78d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 NetworkManager[54954]: <info>  [1769122838.0486] manager: (tap61dce738-50): new Veth device (/org/freedesktop/NetworkManager/Devices/361)
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.050 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[97a3e44f-b97e-41ad-943d-dcc42855b509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.087 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb4d237-bf9f-48b6-b899-35bd19e5ba74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.090 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[59c51747-b39d-4832-8487-a9847ea379df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 NetworkManager[54954]: <info>  [1769122838.1166] device (tap61dce738-50): carrier: link connected
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.121 211685 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c8e529-5d21-4450-9de4-e1ea5d8a7b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.138 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1386402f-63cf-418e-b456-7654975bd5b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61dce738-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:30:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647571, 'reachable_time': 34163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241051, 'error': None, 'target': 'ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.153 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[c798f96e-a790-4397-8cf9-dda0877c9d40]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:3035'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647571, 'tstamp': 647571}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241053, 'error': None, 'target': 'ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.170 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122838.1696627, 49a28248-68bf-4f1c-a38f-01483bb35cf8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.170 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] VM Started (Lifecycle Event)
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.174 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1edf7f-57a4-4e8f-85ae-b92bba250083]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61dce738-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:30:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647571, 'reachable_time': 34163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241054, 'error': None, 'target': 'ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.203 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.204 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0a2342-13cc-4b35-b777-0b35be44fe2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.206 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122838.172813, 49a28248-68bf-4f1c-a38f-01483bb35cf8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.207 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] VM Paused (Lifecycle Event)
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.230 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.234 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.257 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.258 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[1459bec8-d73a-44ae-babf-07d45fdaa1e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.260 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61dce738-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.260 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.261 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61dce738-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.263 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:38 compute-0 NetworkManager[54954]: <info>  [1769122838.2639] manager: (tap61dce738-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 22 23:00:38 compute-0 kernel: tap61dce738-50: entered promiscuous mode
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.267 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61dce738-50, col_values=(('external_ids', {'iface-id': '3dd62629-b338-4c53-923f-9914324106d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:00:38 compute-0 ovn_controller[94850]: 2026-01-22T23:00:38Z|00766|binding|INFO|Releasing lport 3dd62629-b338-4c53-923f-9914324106d2 from this chassis (sb_readonly=0)
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.270 104215 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61dce738-581f-4a17-826d-74026fc4bd1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61dce738-581f-4a17-826d-74026fc4bd1e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.271 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[93bf925e-6a49-4b62-8043-0132f9a236e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.271 104215 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: global
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     log         /dev/log local0 debug
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     log-tag     haproxy-metadata-proxy-61dce738-581f-4a17-826d-74026fc4bd1e
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     user        root
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     group       root
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     maxconn     1024
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     pidfile     /var/lib/neutron/external/pids/61dce738-581f-4a17-826d-74026fc4bd1e.pid.haproxy
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     daemon
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: defaults
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     log global
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     mode http
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     option httplog
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     option dontlognull
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     option http-server-close
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     option forwardfor
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     retries                 3
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     timeout http-request    30s
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     timeout connect         30s
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     timeout client          32s
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     timeout server          32s
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     timeout http-keep-alive 30s
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: listen listener
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     bind 169.254.169.254:80
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:     http-request add-header X-OVN-Network-ID 61dce738-581f-4a17-826d-74026fc4bd1e
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 23:00:38 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:00:38.272 104215 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e', 'env', 'PROCESS_TAG=haproxy-61dce738-581f-4a17-826d-74026fc4bd1e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61dce738-581f-4a17-826d-74026fc4bd1e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.272 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.275 182729 DEBUG nova.compute.manager [req-dde5256f-ebc1-482c-bb4f-1f35479cb47e req-8f2ea4ca-9494-46d0-a92b-310a09328556 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.276 182729 DEBUG oslo_concurrency.lockutils [req-dde5256f-ebc1-482c-bb4f-1f35479cb47e req-8f2ea4ca-9494-46d0-a92b-310a09328556 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.276 182729 DEBUG oslo_concurrency.lockutils [req-dde5256f-ebc1-482c-bb4f-1f35479cb47e req-8f2ea4ca-9494-46d0-a92b-310a09328556 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.277 182729 DEBUG oslo_concurrency.lockutils [req-dde5256f-ebc1-482c-bb4f-1f35479cb47e req-8f2ea4ca-9494-46d0-a92b-310a09328556 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.277 182729 DEBUG nova.compute.manager [req-dde5256f-ebc1-482c-bb4f-1f35479cb47e req-8f2ea4ca-9494-46d0-a92b-310a09328556 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Processing event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.278 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.279 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.281 182729 DEBUG nova.virt.driver [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] Emitting event <LifecycleEvent: 1769122838.2809615, 49a28248-68bf-4f1c-a38f-01483bb35cf8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.281 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] VM Resumed (Lifecycle Event)
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.283 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.286 182729 INFO nova.virt.libvirt.driver [-] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance spawned successfully.
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.287 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.301 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.304 182729 DEBUG nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.313 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.313 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.314 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.315 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.315 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.316 182729 DEBUG nova.virt.libvirt.driver [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.321 182729 INFO nova.compute.manager [None req-85690915-a976-4d1f-9599-4e9730e99825 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.374 182729 INFO nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Took 9.89 seconds to spawn the instance on the hypervisor.
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.374 182729 DEBUG nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.488 182729 INFO nova.compute.manager [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Took 10.50 seconds to build instance.
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.506 182729 DEBUG oslo_concurrency.lockutils [None req-9fe5b93a-7d4e-492b-8f0e-6f5b2e22efcc d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.581 182729 DEBUG nova.network.neutron [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updated VIF entry in instance network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.581 182729 DEBUG nova.network.neutron [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:00:38 compute-0 nova_compute[182725]: 2026-01-22 23:00:38.596 182729 DEBUG oslo_concurrency.lockutils [req-1a7cb01f-ab15-417e-a008-2a0d09c1df43 req-be0ceb1d-fd38-44ed-99f3-849f9437491d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:00:38 compute-0 podman[241086]: 2026-01-22 23:00:38.703832276 +0000 UTC m=+0.060322930 container create 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 23:00:38 compute-0 systemd[1]: Started libpod-conmon-5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84.scope.
Jan 22 23:00:38 compute-0 podman[241086]: 2026-01-22 23:00:38.674361303 +0000 UTC m=+0.030851987 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 23:00:38 compute-0 systemd[1]: Started libcrun container.
Jan 22 23:00:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87722afd9bcde705aa44f0599dc473c6719134827529520f005123d993b8309d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 23:00:38 compute-0 podman[241086]: 2026-01-22 23:00:38.787326811 +0000 UTC m=+0.143817555 container init 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:00:38 compute-0 podman[241086]: 2026-01-22 23:00:38.794071149 +0000 UTC m=+0.150561833 container start 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:00:38 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [NOTICE]   (241107) : New worker (241109) forked
Jan 22 23:00:38 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [NOTICE]   (241107) : Loading success.
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.367 182729 DEBUG nova.compute.manager [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.367 182729 DEBUG oslo_concurrency.lockutils [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.368 182729 DEBUG oslo_concurrency.lockutils [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.368 182729 DEBUG oslo_concurrency.lockutils [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.368 182729 DEBUG nova.compute.manager [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] No waiting events found dispatching network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.368 182729 WARNING nova.compute.manager [req-abad39c4-8de9-4aac-8634-2176d8a30206 req-e7e03cb7-221d-4d02-9c21-1982b92b6620 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received unexpected event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc for instance with vm_state active and task_state None.
Jan 22 23:00:40 compute-0 nova_compute[182725]: 2026-01-22 23:00:40.878 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:41 compute-0 nova_compute[182725]: 2026-01-22 23:00:41.203 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:41 compute-0 nova_compute[182725]: 2026-01-22 23:00:41.767 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:45 compute-0 nova_compute[182725]: 2026-01-22 23:00:45.878 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:45 compute-0 nova_compute[182725]: 2026-01-22 23:00:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:45 compute-0 nova_compute[182725]: 2026-01-22 23:00:45.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:00:45 compute-0 nova_compute[182725]: 2026-01-22 23:00:45.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.208 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.208 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.208 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.208 182729 DEBUG nova.objects.instance [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 49a28248-68bf-4f1c-a38f-01483bb35cf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.362 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:46 compute-0 NetworkManager[54954]: <info>  [1769122846.3636] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 22 23:00:46 compute-0 NetworkManager[54954]: <info>  [1769122846.3645] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 22 23:00:46 compute-0 ovn_controller[94850]: 2026-01-22T23:00:46Z|00767|binding|INFO|Releasing lport 3dd62629-b338-4c53-923f-9914324106d2 from this chassis (sb_readonly=0)
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.425 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.433 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.681 182729 DEBUG nova.compute.manager [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.682 182729 DEBUG nova.compute.manager [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing instance network info cache due to event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.683 182729 DEBUG oslo_concurrency.lockutils [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:00:46 compute-0 nova_compute[182725]: 2026-01-22 23:00:46.769 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.133 182729 DEBUG nova.network.neutron [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.150 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.151 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.151 182729 DEBUG oslo_concurrency.lockutils [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.152 182729 DEBUG nova.network.neutron [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.153 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.153 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.173 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.173 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.174 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.174 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.237 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.304 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.305 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.370 182729 DEBUG oslo_concurrency.processutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.515 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.518 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5508MB free_disk=73.31537246704102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.518 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.519 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.584 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Instance 49a28248-68bf-4f1c-a38f-01483bb35cf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.585 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.585 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.624 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.637 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.656 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:00:48 compute-0 nova_compute[182725]: 2026-01-22 23:00:48.657 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:00:49 compute-0 nova_compute[182725]: 2026-01-22 23:00:49.723 182729 DEBUG nova.network.neutron [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updated VIF entry in instance network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 23:00:49 compute-0 nova_compute[182725]: 2026-01-22 23:00:49.724 182729 DEBUG nova.network.neutron [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:00:49 compute-0 nova_compute[182725]: 2026-01-22 23:00:49.740 182729 DEBUG oslo_concurrency.lockutils [req-d066fd7d-a047-48c5-bb6f-05f19b5c607d req-643a6bfe-fcf0-4773-b7a1-5a93b64ac8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:00:50 compute-0 podman[241141]: 2026-01-22 23:00:50.120667185 +0000 UTC m=+0.049294026 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:00:50 compute-0 podman[241139]: 2026-01-22 23:00:50.124670814 +0000 UTC m=+0.057028198 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 23:00:50 compute-0 podman[241140]: 2026-01-22 23:00:50.149529312 +0000 UTC m=+0.080445230 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 23:00:50 compute-0 ovn_controller[94850]: 2026-01-22T23:00:50Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:bc:e1 10.100.0.8
Jan 22 23:00:50 compute-0 ovn_controller[94850]: 2026-01-22T23:00:50Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:bc:e1 10.100.0.8
Jan 22 23:00:50 compute-0 nova_compute[182725]: 2026-01-22 23:00:50.392 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:50 compute-0 nova_compute[182725]: 2026-01-22 23:00:50.393 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:50 compute-0 nova_compute[182725]: 2026-01-22 23:00:50.879 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:50 compute-0 nova_compute[182725]: 2026-01-22 23:00:50.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:50 compute-0 nova_compute[182725]: 2026-01-22 23:00:50.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:00:51 compute-0 nova_compute[182725]: 2026-01-22 23:00:51.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:54 compute-0 nova_compute[182725]: 2026-01-22 23:00:54.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:00:55 compute-0 nova_compute[182725]: 2026-01-22 23:00:55.880 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:56 compute-0 nova_compute[182725]: 2026-01-22 23:00:56.775 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:00:56 compute-0 nova_compute[182725]: 2026-01-22 23:00:56.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:00 compute-0 nova_compute[182725]: 2026-01-22 23:01:00.883 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:01 compute-0 CROND[241208]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 23:01:01 compute-0 run-parts[241211]: (/etc/cron.hourly) starting 0anacron
Jan 22 23:01:01 compute-0 run-parts[241217]: (/etc/cron.hourly) finished 0anacron
Jan 22 23:01:01 compute-0 CROND[241207]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 23:01:01 compute-0 nova_compute[182725]: 2026-01-22 23:01:01.779 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:02 compute-0 nova_compute[182725]: 2026-01-22 23:01:02.907 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:02 compute-0 nova_compute[182725]: 2026-01-22 23:01:02.907 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:02 compute-0 nova_compute[182725]: 2026-01-22 23:01:02.908 182729 INFO nova.compute.manager [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Shelving
Jan 22 23:01:02 compute-0 nova_compute[182725]: 2026-01-22 23:01:02.950 182729 DEBUG nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 23:01:03 compute-0 podman[241218]: 2026-01-22 23:01:03.135243787 +0000 UTC m=+0.068391901 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 23:01:05 compute-0 kernel: tap74821d1a-2c (unregistering): left promiscuous mode
Jan 22 23:01:05 compute-0 NetworkManager[54954]: <info>  [1769122865.1191] device (tap74821d1a-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 23:01:05 compute-0 ovn_controller[94850]: 2026-01-22T23:01:05Z|00768|binding|INFO|Releasing lport 74821d1a-2c78-4cec-90ae-32a1aa1c77bc from this chassis (sb_readonly=0)
Jan 22 23:01:05 compute-0 ovn_controller[94850]: 2026-01-22T23:01:05Z|00769|binding|INFO|Setting lport 74821d1a-2c78-4cec-90ae-32a1aa1c77bc down in Southbound
Jan 22 23:01:05 compute-0 ovn_controller[94850]: 2026-01-22T23:01:05Z|00770|binding|INFO|Removing iface tap74821d1a-2c ovn-installed in OVS
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.131 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.133 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.134 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.142 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:bc:e1 10.100.0.8'], port_security=['fa:16:3e:bc:bc:e1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '49a28248-68bf-4f1c-a38f-01483bb35cf8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61dce738-581f-4a17-826d-74026fc4bd1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63a8d5918d0a40388e1827e68bbd7188', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f49bec43-bd9f-46cd-9803-de9f64df03d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e7715dc-149e-449b-b509-948ea347b840, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa026186850>], logical_port=74821d1a-2c78-4cec-90ae-32a1aa1c77bc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa026186850>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.145 104215 INFO neutron.agent.ovn.metadata.agent [-] Port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc in datapath 61dce738-581f-4a17-826d-74026fc4bd1e unbound from our chassis
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.148 104215 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61dce738-581f-4a17-826d-74026fc4bd1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.150 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0abbcc-fbf3-4aa6-a097-5e3dc0272c1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.151 104215 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e namespace which is not needed anymore
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.165 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 22 23:01:05 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000bc.scope: Consumed 13.116s CPU time.
Jan 22 23:01:05 compute-0 systemd-machined[154006]: Machine qemu-80-instance-000000bc terminated.
Jan 22 23:01:05 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [NOTICE]   (241107) : haproxy version is 2.8.14-c23fe91
Jan 22 23:01:05 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [NOTICE]   (241107) : path to executable is /usr/sbin/haproxy
Jan 22 23:01:05 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [WARNING]  (241107) : Exiting Master process...
Jan 22 23:01:05 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [ALERT]    (241107) : Current worker (241109) exited with code 143 (Terminated)
Jan 22 23:01:05 compute-0 neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e[241103]: [WARNING]  (241107) : All workers exited. Exiting... (0)
Jan 22 23:01:05 compute-0 systemd[1]: libpod-5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84.scope: Deactivated successfully.
Jan 22 23:01:05 compute-0 podman[241261]: 2026-01-22 23:01:05.286867605 +0000 UTC m=+0.046547938 container died 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 23:01:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84-userdata-shm.mount: Deactivated successfully.
Jan 22 23:01:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-87722afd9bcde705aa44f0599dc473c6719134827529520f005123d993b8309d-merged.mount: Deactivated successfully.
Jan 22 23:01:05 compute-0 podman[241261]: 2026-01-22 23:01:05.323813303 +0000 UTC m=+0.083493656 container cleanup 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:01:05 compute-0 systemd[1]: libpod-conmon-5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84.scope: Deactivated successfully.
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.383 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 podman[241293]: 2026-01-22 23:01:05.395633388 +0000 UTC m=+0.051975812 container remove 5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.403 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[21d6b5d5-8b4a-489d-8f50-0a44ccdf31fb]: (4, ('Thu Jan 22 11:01:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e (5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84)\n5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84\nThu Jan 22 11:01:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e (5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84)\n5b89d15e2fa00ee6f3c40fd7ca664e2e0cb1c11139d56dd2c9eac6971032ca84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.404 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[661cfecc-d8ae-4295-acfe-4a2c8dea8b4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.405 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61dce738-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.407 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 kernel: tap61dce738-50: left promiscuous mode
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.427 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.429 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[57018e38-a9d7-4e98-a7da-e616e4e4ba88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.444 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[13994249-ad12-4c21-9e14-2fedf44706ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.445 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[8866a81b-90b6-4c66-bd55-5c4f071d0585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.460 211671 DEBUG oslo.privsep.daemon [-] privsep: reply[025c4efc-9af1-4c34-8e7d-8c084958ad9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647563, 'reachable_time': 28465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241327, 'error': None, 'target': 'ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.462 104606 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61dce738-581f-4a17-826d-74026fc4bd1e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 23:01:05 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:05.462 104606 DEBUG oslo.privsep.daemon [-] privsep: reply[33d851a8-83d1-49f4-b503-6edf39acb2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 23:01:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d61dce738\x2d581f\x2d4a17\x2d826d\x2d74026fc4bd1e.mount: Deactivated successfully.
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.685 182729 DEBUG nova.compute.manager [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-vif-unplugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.685 182729 DEBUG oslo_concurrency.lockutils [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.685 182729 DEBUG oslo_concurrency.lockutils [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.685 182729 DEBUG oslo_concurrency.lockutils [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.685 182729 DEBUG nova.compute.manager [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] No waiting events found dispatching network-vif-unplugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.686 182729 WARNING nova.compute.manager [req-790e2dcf-f2a5-49f5-8217-94551607d397 req-14a777d1-8767-4bf9-aa63-5607394c3830 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received unexpected event network-vif-unplugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc for instance with vm_state active and task_state shelving.
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.885 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.964 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance shutdown successfully after 3 seconds.
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.970 182729 INFO nova.virt.libvirt.driver [-] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance destroyed successfully.
Jan 22 23:01:05 compute-0 nova_compute[182725]: 2026-01-22 23:01:05.970 182729 DEBUG nova.objects.instance [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lazy-loading 'numa_topology' on Instance uuid 49a28248-68bf-4f1c-a38f-01483bb35cf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.234 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Beginning cold snapshot process
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.409 182729 DEBUG nova.privsep.utils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.409 182729 DEBUG oslo_concurrency.processutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk /var/lib/nova/instances/snapshots/tmpk3i9d6jv/194d6eb997dd4255b18805bdf38659e5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.689 182729 DEBUG oslo_concurrency.processutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8/disk /var/lib/nova/instances/snapshots/tmpk3i9d6jv/194d6eb997dd4255b18805bdf38659e5" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.690 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Snapshot extracted, beginning image upload
Jan 22 23:01:06 compute-0 nova_compute[182725]: 2026-01-22 23:01:06.787 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.797 182729 DEBUG nova.compute.manager [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.798 182729 DEBUG oslo_concurrency.lockutils [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.798 182729 DEBUG oslo_concurrency.lockutils [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.798 182729 DEBUG oslo_concurrency.lockutils [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.799 182729 DEBUG nova.compute.manager [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] No waiting events found dispatching network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 23:01:07 compute-0 nova_compute[182725]: 2026-01-22 23:01:07.799 182729 WARNING nova.compute.manager [req-50e34ecd-cd45-4b70-95c8-6c219a21dad6 req-0ee6910a-5099-41d5-aa42-c8159540878d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received unexpected event network-vif-plugged-74821d1a-2c78-4cec-90ae-32a1aa1c77bc for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 23:01:08 compute-0 podman[241339]: 2026-01-22 23:01:08.153260167 +0000 UTC m=+0.078827240 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, 
Inc., release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=)
Jan 22 23:01:08 compute-0 podman[241338]: 2026-01-22 23:01:08.224529058 +0000 UTC m=+0.162955851 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 23:01:08 compute-0 nova_compute[182725]: 2026-01-22 23:01:08.982 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Snapshot image upload complete
Jan 22 23:01:08 compute-0 nova_compute[182725]: 2026-01-22 23:01:08.983 182729 DEBUG nova.compute.manager [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.089 182729 INFO nova.compute.manager [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Shelve offloading
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.110 182729 INFO nova.virt.libvirt.driver [-] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance destroyed successfully.
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.111 182729 DEBUG nova.compute.manager [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.114 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.115 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:01:09 compute-0 nova_compute[182725]: 2026-01-22 23:01:09.116 182729 DEBUG nova.network.neutron [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 23:01:10 compute-0 nova_compute[182725]: 2026-01-22 23:01:10.887 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:11 compute-0 nova_compute[182725]: 2026-01-22 23:01:11.474 182729 DEBUG nova.network.neutron [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:01:11 compute-0 nova_compute[182725]: 2026-01-22 23:01:11.496 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:01:11 compute-0 nova_compute[182725]: 2026-01-22 23:01:11.789 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:12.469 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:12.470 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:12.470 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:12.754 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.755 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:12.755 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.822 182729 INFO nova.virt.libvirt.driver [-] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Instance destroyed successfully.
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.822 182729 DEBUG nova.objects.instance [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lazy-loading 'resources' on Instance uuid 49a28248-68bf-4f1c-a38f-01483bb35cf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.840 182729 DEBUG nova.virt.libvirt.vif [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T23:00:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-893723156',display_name='tempest-TestShelveInstance-server-893723156',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-893723156',id=188,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC3HtHpV1FD2iesqzTdA4pNZO4ZemGeAj/PJq6gQTLKoxhM8z95o7fm8rT3rH9LD9Kyoq2hFFqCG8KrM/FAzOzMggGNvnKC1BVAxCZoE730yH6SjrGFZAOsvREtyXKBxKA==',key_name='tempest-TestShelveInstance-205565771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T23:00:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='63a8d5918d0a40388e1827e68bbd7188',ramdisk_id='',reservation_id='r-a589qtli',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1247982194',owner_user_name='tempest-TestShelveInstance-1247982194-project-member',shelved_at='2026-01-22T23:01:08.983566',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='9ef2b31a-bcd3-474e-bd1e-5c6d2604740c'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T23:01:06Z,user_data=None,user_id='d5c61ac7024e4bf483db50ec6fa503cd',uuid=49a28248-68bf-4f1c-a38f-01483bb35cf8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.841 182729 DEBUG nova.network.os_vif_util [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converting VIF {"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": "br-int", "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74821d1a-2c", "ovs_interfaceid": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.842 182729 DEBUG nova.network.os_vif_util [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.843 182729 DEBUG os_vif [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.845 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.845 182729 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74821d1a-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.847 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.849 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.852 182729 INFO os_vif [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:bc:e1,bridge_name='br-int',has_traffic_filtering=True,id=74821d1a-2c78-4cec-90ae-32a1aa1c77bc,network=Network(61dce738-581f-4a17-826d-74026fc4bd1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74821d1a-2c')
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.853 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Deleting instance files /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8_del
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.863 182729 INFO nova.virt.libvirt.driver [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Deletion of /var/lib/nova/instances/49a28248-68bf-4f1c-a38f-01483bb35cf8_del complete
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.903 182729 DEBUG nova.compute.manager [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Received event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.903 182729 DEBUG nova.compute.manager [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing instance network info cache due to event network-changed-74821d1a-2c78-4cec-90ae-32a1aa1c77bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.903 182729 DEBUG oslo_concurrency.lockutils [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.903 182729 DEBUG oslo_concurrency.lockutils [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.904 182729 DEBUG nova.network.neutron [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Refreshing network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 23:01:12 compute-0 nova_compute[182725]: 2026-01-22 23:01:12.973 182729 INFO nova.scheduler.client.report [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Deleted allocations for instance 49a28248-68bf-4f1c-a38f-01483bb35cf8
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.025 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.026 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.051 182729 DEBUG nova.compute.provider_tree [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.065 182729 DEBUG nova.scheduler.client.report [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.079 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:13 compute-0 nova_compute[182725]: 2026-01-22 23:01:13.140 182729 DEBUG oslo_concurrency.lockutils [None req-31c98733-927c-4893-8495-f1c65161677c d5c61ac7024e4bf483db50ec6fa503cd 63a8d5918d0a40388e1827e68bbd7188 - - default default] Lock "49a28248-68bf-4f1c-a38f-01483bb35cf8" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 10.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:13 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:13.757 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:01:15 compute-0 nova_compute[182725]: 2026-01-22 23:01:15.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:16 compute-0 nova_compute[182725]: 2026-01-22 23:01:16.733 182729 DEBUG nova.network.neutron [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updated VIF entry in instance network info cache for port 74821d1a-2c78-4cec-90ae-32a1aa1c77bc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 23:01:16 compute-0 nova_compute[182725]: 2026-01-22 23:01:16.734 182729 DEBUG nova.network.neutron [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Updating instance_info_cache with network_info: [{"id": "74821d1a-2c78-4cec-90ae-32a1aa1c77bc", "address": "fa:16:3e:bc:bc:e1", "network": {"id": "61dce738-581f-4a17-826d-74026fc4bd1e", "bridge": null, "label": "tempest-TestShelveInstance-619158506-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63a8d5918d0a40388e1827e68bbd7188", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap74821d1a-2c", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 23:01:16 compute-0 nova_compute[182725]: 2026-01-22 23:01:16.750 182729 DEBUG oslo_concurrency.lockutils [req-a8c67945-d2ce-4679-981e-cb60afc7b98f req-02ce2cfa-4a27-4766-8c5b-66f65fd6ffd1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-49a28248-68bf-4f1c-a38f-01483bb35cf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 23:01:17 compute-0 nova_compute[182725]: 2026-01-22 23:01:17.825 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:17 compute-0 nova_compute[182725]: 2026-01-22 23:01:17.848 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:20 compute-0 nova_compute[182725]: 2026-01-22 23:01:20.418 182729 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122865.412922, 49a28248-68bf-4f1c-a38f-01483bb35cf8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 23:01:20 compute-0 nova_compute[182725]: 2026-01-22 23:01:20.418 182729 INFO nova.compute.manager [-] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] VM Stopped (Lifecycle Event)
Jan 22 23:01:20 compute-0 nova_compute[182725]: 2026-01-22 23:01:20.464 182729 DEBUG nova.compute.manager [None req-606776b8-50da-4a9e-be5e-0c3168e7eaca - - - - - -] [instance: 49a28248-68bf-4f1c-a38f-01483bb35cf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 23:01:20 compute-0 nova_compute[182725]: 2026-01-22 23:01:20.891 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:21 compute-0 podman[241389]: 2026-01-22 23:01:21.156588137 +0000 UTC m=+0.068675207 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:01:21 compute-0 podman[241388]: 2026-01-22 23:01:21.174555344 +0000 UTC m=+0.097993807 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 23:01:21 compute-0 podman[241387]: 2026-01-22 23:01:21.193149996 +0000 UTC m=+0.114275391 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:01:22 compute-0 nova_compute[182725]: 2026-01-22 23:01:22.850 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:25 compute-0 nova_compute[182725]: 2026-01-22 23:01:25.894 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:27 compute-0 nova_compute[182725]: 2026-01-22 23:01:27.852 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:30 compute-0 nova_compute[182725]: 2026-01-22 23:01:30.895 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:32 compute-0 nova_compute[182725]: 2026-01-22 23:01:32.854 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:34 compute-0 podman[241453]: 2026-01-22 23:01:34.160583046 +0000 UTC m=+0.093838953 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 23:01:35 compute-0 nova_compute[182725]: 2026-01-22 23:01:35.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:37 compute-0 nova_compute[182725]: 2026-01-22 23:01:37.856 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:39 compute-0 podman[241474]: 2026-01-22 23:01:39.129066054 +0000 UTC m=+0.060847923 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=openstack_network_exporter)
Jan 22 23:01:39 compute-0 podman[241473]: 2026-01-22 23:01:39.148662801 +0000 UTC m=+0.082300526 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 23:01:40 compute-0 nova_compute[182725]: 2026-01-22 23:01:40.898 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:40 compute-0 nova_compute[182725]: 2026-01-22 23:01:40.936 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:42 compute-0 ovn_controller[94850]: 2026-01-22T23:01:42Z|00771|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 23:01:42 compute-0 nova_compute[182725]: 2026-01-22 23:01:42.858 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:45 compute-0 nova_compute[182725]: 2026-01-22 23:01:45.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:45 compute-0 nova_compute[182725]: 2026-01-22 23:01:45.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:01:45 compute-0 nova_compute[182725]: 2026-01-22 23:01:45.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:01:45 compute-0 nova_compute[182725]: 2026-01-22 23:01:45.908 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:01:45 compute-0 nova_compute[182725]: 2026-01-22 23:01:45.938 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:47 compute-0 nova_compute[182725]: 2026-01-22 23:01:47.860 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:47 compute-0 nova_compute[182725]: 2026-01-22 23:01:47.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:48 compute-0 nova_compute[182725]: 2026-01-22 23:01:48.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:48 compute-0 nova_compute[182725]: 2026-01-22 23:01:48.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:48 compute-0 nova_compute[182725]: 2026-01-22 23:01:48.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:48 compute-0 nova_compute[182725]: 2026-01-22 23:01:48.912 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:48 compute-0 nova_compute[182725]: 2026-01-22 23:01:48.912 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.055 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.055 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5659MB free_disk=73.31620788574219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.056 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.056 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.105 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.106 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.118 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.146 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.147 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.159 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.187 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.216 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.231 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.256 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:01:49 compute-0 nova_compute[182725]: 2026-01-22 23:01:49.256 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:01:50 compute-0 nova_compute[182725]: 2026-01-22 23:01:50.257 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:50 compute-0 nova_compute[182725]: 2026-01-22 23:01:50.258 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:50 compute-0 nova_compute[182725]: 2026-01-22 23:01:50.940 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:51 compute-0 nova_compute[182725]: 2026-01-22 23:01:51.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:51 compute-0 nova_compute[182725]: 2026-01-22 23:01:51.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:01:52 compute-0 podman[241523]: 2026-01-22 23:01:52.152860624 +0000 UTC m=+0.077288892 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:01:52 compute-0 podman[241521]: 2026-01-22 23:01:52.164621586 +0000 UTC m=+0.092372247 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:01:52 compute-0 podman[241522]: 2026-01-22 23:01:52.177380283 +0000 UTC m=+0.099886784 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 23:01:52 compute-0 nova_compute[182725]: 2026-01-22 23:01:52.862 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:54 compute-0 nova_compute[182725]: 2026-01-22 23:01:54.689 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:54.689 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 23:01:54 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:01:54.691 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 23:01:55 compute-0 nova_compute[182725]: 2026-01-22 23:01:55.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:01:56 compute-0 nova_compute[182725]: 2026-01-22 23:01:56.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:56 compute-0 nova_compute[182725]: 2026-01-22 23:01:56.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:01:57 compute-0 nova_compute[182725]: 2026-01-22 23:01:57.864 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:00 compute-0 nova_compute[182725]: 2026-01-22 23:02:00.944 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:02 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:02:02.693 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:02:02 compute-0 nova_compute[182725]: 2026-01-22 23:02:02.867 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:05 compute-0 podman[241583]: 2026-01-22 23:02:05.127629134 +0000 UTC m=+0.063788986 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 23:02:05 compute-0 nova_compute[182725]: 2026-01-22 23:02:05.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:06 compute-0 nova_compute[182725]: 2026-01-22 23:02:06.363 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:06 compute-0 nova_compute[182725]: 2026-01-22 23:02:06.466 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:07 compute-0 nova_compute[182725]: 2026-01-22 23:02:07.870 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:02:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:02:09 compute-0 nova_compute[182725]: 2026-01-22 23:02:09.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:10 compute-0 podman[241605]: 2026-01-22 23:02:10.174772441 +0000 UTC m=+0.092848596 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 23:02:10 compute-0 podman[241604]: 2026-01-22 23:02:10.231739257 +0000 UTC m=+0.155415370 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 23:02:10 compute-0 nova_compute[182725]: 2026-01-22 23:02:10.974 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:02:12.471 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:02:12.472 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:02:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:02:12.472 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:02:12 compute-0 nova_compute[182725]: 2026-01-22 23:02:12.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:15 compute-0 nova_compute[182725]: 2026-01-22 23:02:15.976 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:17 compute-0 nova_compute[182725]: 2026-01-22 23:02:17.874 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:20 compute-0 nova_compute[182725]: 2026-01-22 23:02:20.979 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:22 compute-0 nova_compute[182725]: 2026-01-22 23:02:22.876 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:23 compute-0 podman[241653]: 2026-01-22 23:02:23.149037354 +0000 UTC m=+0.077312061 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 23:02:23 compute-0 podman[241654]: 2026-01-22 23:02:23.150577382 +0000 UTC m=+0.075860905 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 23:02:23 compute-0 podman[241652]: 2026-01-22 23:02:23.170594027 +0000 UTC m=+0.099189172 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:02:25 compute-0 nova_compute[182725]: 2026-01-22 23:02:25.986 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:27 compute-0 nova_compute[182725]: 2026-01-22 23:02:27.879 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:30 compute-0 nova_compute[182725]: 2026-01-22 23:02:30.991 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:32 compute-0 nova_compute[182725]: 2026-01-22 23:02:32.881 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:35 compute-0 nova_compute[182725]: 2026-01-22 23:02:35.991 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:36 compute-0 podman[241716]: 2026-01-22 23:02:36.12949376 +0000 UTC m=+0.067906270 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 23:02:37 compute-0 nova_compute[182725]: 2026-01-22 23:02:37.883 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:40 compute-0 nova_compute[182725]: 2026-01-22 23:02:40.902 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:40 compute-0 nova_compute[182725]: 2026-01-22 23:02:40.994 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:41 compute-0 podman[241735]: 2026-01-22 23:02:41.175454897 +0000 UTC m=+0.098215788 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, config_id=openstack_network_exporter)
Jan 22 23:02:41 compute-0 podman[241734]: 2026-01-22 23:02:41.196440006 +0000 UTC m=+0.124543059 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 23:02:42 compute-0 ovn_controller[94850]: 2026-01-22T23:02:42Z|00772|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 23:02:42 compute-0 nova_compute[182725]: 2026-01-22 23:02:42.885 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:45 compute-0 nova_compute[182725]: 2026-01-22 23:02:45.995 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:47 compute-0 nova_compute[182725]: 2026-01-22 23:02:47.888 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:47 compute-0 nova_compute[182725]: 2026-01-22 23:02:47.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:47 compute-0 nova_compute[182725]: 2026-01-22 23:02:47.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:02:47 compute-0 nova_compute[182725]: 2026-01-22 23:02:47.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:02:47 compute-0 nova_compute[182725]: 2026-01-22 23:02:47.911 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:02:48 compute-0 nova_compute[182725]: 2026-01-22 23:02:48.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:49 compute-0 nova_compute[182725]: 2026-01-22 23:02:49.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.916 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:02:50 compute-0 nova_compute[182725]: 2026-01-22 23:02:50.998 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.088 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.089 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5687MB free_disk=73.31660461425781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.089 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.089 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.155 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.156 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.178 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.194 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.195 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:02:51 compute-0 nova_compute[182725]: 2026-01-22 23:02:51.196 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:02:52 compute-0 nova_compute[182725]: 2026-01-22 23:02:52.196 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:52 compute-0 nova_compute[182725]: 2026-01-22 23:02:52.197 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:52 compute-0 nova_compute[182725]: 2026-01-22 23:02:52.198 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:02:52 compute-0 nova_compute[182725]: 2026-01-22 23:02:52.890 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:54 compute-0 podman[241778]: 2026-01-22 23:02:54.118689947 +0000 UTC m=+0.054193491 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 23:02:54 compute-0 podman[241780]: 2026-01-22 23:02:54.123030324 +0000 UTC m=+0.054229911 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:02:54 compute-0 podman[241779]: 2026-01-22 23:02:54.152169164 +0000 UTC m=+0.081665279 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 23:02:56 compute-0 nova_compute[182725]: 2026-01-22 23:02:56.000 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:02:56 compute-0 nova_compute[182725]: 2026-01-22 23:02:56.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:57 compute-0 nova_compute[182725]: 2026-01-22 23:02:57.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:02:57 compute-0 nova_compute[182725]: 2026-01-22 23:02:57.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:01 compute-0 nova_compute[182725]: 2026-01-22 23:03:01.002 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:02 compute-0 nova_compute[182725]: 2026-01-22 23:03:02.940 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:06 compute-0 nova_compute[182725]: 2026-01-22 23:03:06.004 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:07 compute-0 podman[241847]: 2026-01-22 23:03:07.17292887 +0000 UTC m=+0.096102106 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 23:03:07 compute-0 nova_compute[182725]: 2026-01-22 23:03:07.942 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:11 compute-0 nova_compute[182725]: 2026-01-22 23:03:11.005 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:12 compute-0 sshd-session[241868]: Accepted publickey for zuul from 192.168.122.10 port 35610 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 23:03:12 compute-0 systemd-logind[801]: New session 47 of user zuul.
Jan 22 23:03:12 compute-0 systemd[1]: Started Session 47 of User zuul.
Jan 22 23:03:12 compute-0 sshd-session[241868]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 23:03:12 compute-0 podman[241871]: 2026-01-22 23:03:12.196922054 +0000 UTC m=+0.115659210 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter)
Jan 22 23:03:12 compute-0 podman[241870]: 2026-01-22 23:03:12.22995099 +0000 UTC m=+0.154532200 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 23:03:12 compute-0 sudo[241920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 23:03:12 compute-0 sudo[241920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 23:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:03:12.473 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:03:12.474 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:03:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:03:12.474 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:03:12 compute-0 nova_compute[182725]: 2026-01-22 23:03:12.943 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:16 compute-0 nova_compute[182725]: 2026-01-22 23:03:16.007 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:17 compute-0 ovs-vsctl[242092]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 23:03:17 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 241945 (sos)
Jan 22 23:03:17 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 23:03:17 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 23:03:17 compute-0 nova_compute[182725]: 2026-01-22 23:03:17.945 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:18 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 23:03:18 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 23:03:18 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 23:03:19 compute-0 crontab[242493]: (root) LIST (root)
Jan 22 23:03:21 compute-0 nova_compute[182725]: 2026-01-22 23:03:21.008 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:21 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 23:03:21 compute-0 systemd[1]: Started Hostname Service.
Jan 22 23:03:22 compute-0 nova_compute[182725]: 2026-01-22 23:03:22.946 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:25 compute-0 podman[242991]: 2026-01-22 23:03:25.142187894 +0000 UTC m=+0.070003441 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:03:25 compute-0 podman[242995]: 2026-01-22 23:03:25.14365173 +0000 UTC m=+0.070932164 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:03:25 compute-0 podman[242994]: 2026-01-22 23:03:25.152721454 +0000 UTC m=+0.072880372 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:03:26 compute-0 nova_compute[182725]: 2026-01-22 23:03:26.010 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:27 compute-0 nova_compute[182725]: 2026-01-22 23:03:27.948 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:29 compute-0 ovs-appctl[244030]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 23:03:29 compute-0 ovs-appctl[244040]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 23:03:29 compute-0 ovs-appctl[244047]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 23:03:31 compute-0 nova_compute[182725]: 2026-01-22 23:03:31.033 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:32 compute-0 nova_compute[182725]: 2026-01-22 23:03:32.952 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:35 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 23:03:36 compute-0 nova_compute[182725]: 2026-01-22 23:03:36.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:37 compute-0 podman[245308]: 2026-01-22 23:03:37.310164554 +0000 UTC m=+0.094758343 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 23:03:37 compute-0 nova_compute[182725]: 2026-01-22 23:03:37.955 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:38 compute-0 systemd[1]: Starting Time & Date Service...
Jan 22 23:03:38 compute-0 systemd[1]: Started Time & Date Service.
Jan 22 23:03:41 compute-0 nova_compute[182725]: 2026-01-22 23:03:41.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:42 compute-0 nova_compute[182725]: 2026-01-22 23:03:42.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:42 compute-0 nova_compute[182725]: 2026-01-22 23:03:42.957 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:43 compute-0 podman[245383]: 2026-01-22 23:03:43.131395391 +0000 UTC m=+0.064290610 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 23:03:43 compute-0 podman[245382]: 2026-01-22 23:03:43.158717626 +0000 UTC m=+0.094461725 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 23:03:46 compute-0 nova_compute[182725]: 2026-01-22 23:03:46.036 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:47 compute-0 nova_compute[182725]: 2026-01-22 23:03:47.959 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:48 compute-0 nova_compute[182725]: 2026-01-22 23:03:48.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:48 compute-0 nova_compute[182725]: 2026-01-22 23:03:48.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:03:48 compute-0 nova_compute[182725]: 2026-01-22 23:03:48.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:03:48 compute-0 nova_compute[182725]: 2026-01-22 23:03:48.941 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:03:50 compute-0 nova_compute[182725]: 2026-01-22 23:03:50.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:50 compute-0 nova_compute[182725]: 2026-01-22 23:03:50.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:51 compute-0 nova_compute[182725]: 2026-01-22 23:03:51.038 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.912 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.913 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:03:52 compute-0 nova_compute[182725]: 2026-01-22 23:03:52.961 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.049 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.050 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5395MB free_disk=72.97147369384766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.050 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.050 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.128 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.128 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.159 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.172 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.197 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:03:53 compute-0 nova_compute[182725]: 2026-01-22 23:03:53.198 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:03:54 compute-0 nova_compute[182725]: 2026-01-22 23:03:54.197 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:54 compute-0 nova_compute[182725]: 2026-01-22 23:03:54.198 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:03:56 compute-0 nova_compute[182725]: 2026-01-22 23:03:56.041 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:56 compute-0 podman[245431]: 2026-01-22 23:03:56.137617815 +0000 UTC m=+0.060367053 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:03:56 compute-0 podman[245430]: 2026-01-22 23:03:56.152843312 +0000 UTC m=+0.082556542 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 23:03:56 compute-0 podman[245429]: 2026-01-22 23:03:56.156566113 +0000 UTC m=+0.085330119 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:03:57 compute-0 nova_compute[182725]: 2026-01-22 23:03:57.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:03:57 compute-0 nova_compute[182725]: 2026-01-22 23:03:57.963 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:03:58 compute-0 nova_compute[182725]: 2026-01-22 23:03:58.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:01 compute-0 nova_compute[182725]: 2026-01-22 23:04:01.041 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:02 compute-0 nova_compute[182725]: 2026-01-22 23:04:02.964 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:03 compute-0 sudo[241920]: pam_unix(sudo:session): session closed for user root
Jan 22 23:04:03 compute-0 sshd-session[241908]: Received disconnect from 192.168.122.10 port 35610:11: disconnected by user
Jan 22 23:04:03 compute-0 sshd-session[241908]: Disconnected from user zuul 192.168.122.10 port 35610
Jan 22 23:04:03 compute-0 sshd-session[241868]: pam_unix(sshd:session): session closed for user zuul
Jan 22 23:04:03 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Jan 22 23:04:03 compute-0 systemd[1]: session-47.scope: Consumed 1min 22.850s CPU time, 583.3M memory peak, read 102.8M from disk, written 18.3M to disk.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Session 47 logged out. Waiting for processes to exit.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Removed session 47.
Jan 22 23:04:03 compute-0 sshd-session[245490]: Accepted publickey for zuul from 192.168.122.10 port 54894 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 23:04:03 compute-0 systemd-logind[801]: New session 48 of user zuul.
Jan 22 23:04:03 compute-0 systemd[1]: Started Session 48 of User zuul.
Jan 22 23:04:03 compute-0 sshd-session[245490]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 23:04:03 compute-0 sudo[245494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-01-22-nlmnowa.tar.xz
Jan 22 23:04:03 compute-0 sudo[245494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 23:04:03 compute-0 sudo[245494]: pam_unix(sudo:session): session closed for user root
Jan 22 23:04:03 compute-0 sshd-session[245493]: Received disconnect from 192.168.122.10 port 54894:11: disconnected by user
Jan 22 23:04:03 compute-0 sshd-session[245493]: Disconnected from user zuul 192.168.122.10 port 54894
Jan 22 23:04:03 compute-0 sshd-session[245490]: pam_unix(sshd:session): session closed for user zuul
Jan 22 23:04:03 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Session 48 logged out. Waiting for processes to exit.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Removed session 48.
Jan 22 23:04:03 compute-0 sshd-session[245519]: Accepted publickey for zuul from 192.168.122.10 port 54896 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 23:04:03 compute-0 systemd-logind[801]: New session 49 of user zuul.
Jan 22 23:04:03 compute-0 systemd[1]: Started Session 49 of User zuul.
Jan 22 23:04:03 compute-0 sshd-session[245519]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 23:04:03 compute-0 sudo[245523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 22 23:04:03 compute-0 sudo[245523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 23:04:03 compute-0 sudo[245523]: pam_unix(sudo:session): session closed for user root
Jan 22 23:04:03 compute-0 sshd-session[245522]: Received disconnect from 192.168.122.10 port 54896:11: disconnected by user
Jan 22 23:04:03 compute-0 sshd-session[245522]: Disconnected from user zuul 192.168.122.10 port 54896
Jan 22 23:04:03 compute-0 sshd-session[245519]: pam_unix(sshd:session): session closed for user zuul
Jan 22 23:04:03 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Session 49 logged out. Waiting for processes to exit.
Jan 22 23:04:03 compute-0 systemd-logind[801]: Removed session 49.
Jan 22 23:04:06 compute-0 nova_compute[182725]: 2026-01-22 23:04:06.043 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:07 compute-0 nova_compute[182725]: 2026-01-22 23:04:07.966 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:08 compute-0 podman[245548]: 2026-01-22 23:04:08.148817225 +0000 UTC m=+0.077208949 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 23:04:08 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 23:04:08 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:04:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:04:10 compute-0 nova_compute[182725]: 2026-01-22 23:04:10.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:11 compute-0 nova_compute[182725]: 2026-01-22 23:04:11.044 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:04:12.475 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:04:12.475 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:04:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:04:12.475 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:04:12 compute-0 nova_compute[182725]: 2026-01-22 23:04:12.968 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:14 compute-0 podman[245574]: 2026-01-22 23:04:14.137241883 +0000 UTC m=+0.066463434 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Jan 22 23:04:14 compute-0 podman[245573]: 2026-01-22 23:04:14.148552032 +0000 UTC m=+0.082805387 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 23:04:16 compute-0 nova_compute[182725]: 2026-01-22 23:04:16.047 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:17 compute-0 nova_compute[182725]: 2026-01-22 23:04:17.970 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:21 compute-0 nova_compute[182725]: 2026-01-22 23:04:21.103 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:22 compute-0 nova_compute[182725]: 2026-01-22 23:04:22.973 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:26 compute-0 nova_compute[182725]: 2026-01-22 23:04:26.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:27 compute-0 podman[245620]: 2026-01-22 23:04:27.120730175 +0000 UTC m=+0.055984644 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 23:04:27 compute-0 podman[245621]: 2026-01-22 23:04:27.144892253 +0000 UTC m=+0.076763459 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 23:04:27 compute-0 podman[245622]: 2026-01-22 23:04:27.14762158 +0000 UTC m=+0.076072821 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:04:27 compute-0 nova_compute[182725]: 2026-01-22 23:04:27.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:31 compute-0 nova_compute[182725]: 2026-01-22 23:04:31.110 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:32 compute-0 nova_compute[182725]: 2026-01-22 23:04:32.978 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:36 compute-0 nova_compute[182725]: 2026-01-22 23:04:36.115 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:37 compute-0 nova_compute[182725]: 2026-01-22 23:04:37.981 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:39 compute-0 podman[245686]: 2026-01-22 23:04:39.144113073 +0000 UTC m=+0.076363749 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 23:04:41 compute-0 nova_compute[182725]: 2026-01-22 23:04:41.118 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:42 compute-0 nova_compute[182725]: 2026-01-22 23:04:42.902 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:42 compute-0 nova_compute[182725]: 2026-01-22 23:04:42.984 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:45 compute-0 podman[245709]: 2026-01-22 23:04:45.156828031 +0000 UTC m=+0.086003296 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64)
Jan 22 23:04:45 compute-0 podman[245708]: 2026-01-22 23:04:45.220688179 +0000 UTC m=+0.152771486 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 23:04:46 compute-0 nova_compute[182725]: 2026-01-22 23:04:46.119 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:47 compute-0 nova_compute[182725]: 2026-01-22 23:04:47.986 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:48 compute-0 nova_compute[182725]: 2026-01-22 23:04:48.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:48 compute-0 nova_compute[182725]: 2026-01-22 23:04:48.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:04:48 compute-0 nova_compute[182725]: 2026-01-22 23:04:48.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:04:48 compute-0 nova_compute[182725]: 2026-01-22 23:04:48.902 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:04:50 compute-0 nova_compute[182725]: 2026-01-22 23:04:50.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:50 compute-0 nova_compute[182725]: 2026-01-22 23:04:50.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:51 compute-0 nova_compute[182725]: 2026-01-22 23:04:51.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:52 compute-0 nova_compute[182725]: 2026-01-22 23:04:52.987 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:53 compute-0 nova_compute[182725]: 2026-01-22 23:04:53.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:53 compute-0 nova_compute[182725]: 2026-01-22 23:04:53.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:53 compute-0 nova_compute[182725]: 2026-01-22 23:04:53.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:04:53 compute-0 nova_compute[182725]: 2026-01-22 23:04:53.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:53 compute-0 nova_compute[182725]: 2026-01-22 23:04:53.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 23:04:54 compute-0 nova_compute[182725]: 2026-01-22 23:04:54.908 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:54 compute-0 nova_compute[182725]: 2026-01-22 23:04:54.950 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:04:54 compute-0 nova_compute[182725]: 2026-01-22 23:04:54.951 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:04:54 compute-0 nova_compute[182725]: 2026-01-22 23:04:54.951 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:04:54 compute-0 nova_compute[182725]: 2026-01-22 23:04:54.952 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.111 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.112 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=73.31542205810547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.113 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.113 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.315 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.316 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.512 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.551 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.584 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:04:55 compute-0 nova_compute[182725]: 2026-01-22 23:04:55.585 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:04:56 compute-0 nova_compute[182725]: 2026-01-22 23:04:56.190 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:57 compute-0 nova_compute[182725]: 2026-01-22 23:04:57.988 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:04:58 compute-0 podman[245755]: 2026-01-22 23:04:58.110022326 +0000 UTC m=+0.048679745 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:04:58 compute-0 podman[245757]: 2026-01-22 23:04:58.118089305 +0000 UTC m=+0.049585156 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:04:58 compute-0 podman[245756]: 2026-01-22 23:04:58.143223526 +0000 UTC m=+0.075556558 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 23:04:58 compute-0 nova_compute[182725]: 2026-01-22 23:04:58.565 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:04:59 compute-0 nova_compute[182725]: 2026-01-22 23:04:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:01 compute-0 nova_compute[182725]: 2026-01-22 23:05:01.192 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:02 compute-0 nova_compute[182725]: 2026-01-22 23:05:02.990 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:06 compute-0 nova_compute[182725]: 2026-01-22 23:05:06.195 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:07 compute-0 nova_compute[182725]: 2026-01-22 23:05:07.991 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:10 compute-0 podman[245819]: 2026-01-22 23:05:10.161369333 +0000 UTC m=+0.086531060 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 23:05:11 compute-0 nova_compute[182725]: 2026-01-22 23:05:11.196 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:05:12.475 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:05:12.476 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:05:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:05:12.476 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:05:13 compute-0 nova_compute[182725]: 2026-01-22 23:05:13.032 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:13 compute-0 nova_compute[182725]: 2026-01-22 23:05:13.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:16 compute-0 podman[245839]: 2026-01-22 23:05:16.145068615 +0000 UTC m=+0.084618421 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:05:16 compute-0 podman[245840]: 2026-01-22 23:05:16.157329058 +0000 UTC m=+0.091236184 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, vcs-type=git)
Jan 22 23:05:16 compute-0 nova_compute[182725]: 2026-01-22 23:05:16.197 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:18 compute-0 nova_compute[182725]: 2026-01-22 23:05:18.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:21 compute-0 nova_compute[182725]: 2026-01-22 23:05:21.201 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:23 compute-0 nova_compute[182725]: 2026-01-22 23:05:23.035 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:23 compute-0 nova_compute[182725]: 2026-01-22 23:05:23.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:23 compute-0 nova_compute[182725]: 2026-01-22 23:05:23.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 23:05:24 compute-0 nova_compute[182725]: 2026-01-22 23:05:24.314 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 23:05:26 compute-0 nova_compute[182725]: 2026-01-22 23:05:26.238 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:28 compute-0 nova_compute[182725]: 2026-01-22 23:05:28.037 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:29 compute-0 podman[245887]: 2026-01-22 23:05:29.109360173 +0000 UTC m=+0.044942022 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 23:05:29 compute-0 podman[245886]: 2026-01-22 23:05:29.126523397 +0000 UTC m=+0.056661401 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:05:29 compute-0 podman[245888]: 2026-01-22 23:05:29.138511073 +0000 UTC m=+0.073064276 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:05:31 compute-0 nova_compute[182725]: 2026-01-22 23:05:31.274 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:33 compute-0 nova_compute[182725]: 2026-01-22 23:05:33.038 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:36 compute-0 nova_compute[182725]: 2026-01-22 23:05:36.295 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:38 compute-0 nova_compute[182725]: 2026-01-22 23:05:38.040 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:41 compute-0 podman[245950]: 2026-01-22 23:05:41.121855011 +0000 UTC m=+0.056630501 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 22 23:05:41 compute-0 nova_compute[182725]: 2026-01-22 23:05:41.302 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:43 compute-0 nova_compute[182725]: 2026-01-22 23:05:43.061 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:44 compute-0 nova_compute[182725]: 2026-01-22 23:05:44.292 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:46 compute-0 nova_compute[182725]: 2026-01-22 23:05:46.303 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:47 compute-0 podman[245971]: 2026-01-22 23:05:47.135032581 +0000 UTC m=+0.076387678 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 23:05:47 compute-0 podman[245972]: 2026-01-22 23:05:47.152716848 +0000 UTC m=+0.087291238 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public)
Jan 22 23:05:48 compute-0 nova_compute[182725]: 2026-01-22 23:05:48.063 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:50 compute-0 nova_compute[182725]: 2026-01-22 23:05:50.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:51 compute-0 nova_compute[182725]: 2026-01-22 23:05:51.304 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:53 compute-0 nova_compute[182725]: 2026-01-22 23:05:53.065 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:54 compute-0 nova_compute[182725]: 2026-01-22 23:05:54.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:55 compute-0 nova_compute[182725]: 2026-01-22 23:05:55.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:55 compute-0 nova_compute[182725]: 2026-01-22 23:05:55.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.307 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.911 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:05:56 compute-0 nova_compute[182725]: 2026-01-22 23:05:56.912 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.118 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.120 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=73.31542205810547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.120 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.120 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.182 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.183 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.209 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.223 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.224 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:05:57 compute-0 nova_compute[182725]: 2026-01-22 23:05:57.225 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:05:58 compute-0 nova_compute[182725]: 2026-01-22 23:05:58.068 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:05:59 compute-0 nova_compute[182725]: 2026-01-22 23:05:59.225 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:00 compute-0 podman[246019]: 2026-01-22 23:06:00.123577459 +0000 UTC m=+0.058750343 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 23:06:00 compute-0 podman[246020]: 2026-01-22 23:06:00.123777464 +0000 UTC m=+0.055634186 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 23:06:00 compute-0 podman[246021]: 2026-01-22 23:06:00.1312993 +0000 UTC m=+0.060721091 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 23:06:00 compute-0 nova_compute[182725]: 2026-01-22 23:06:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:01 compute-0 nova_compute[182725]: 2026-01-22 23:06:01.310 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:03 compute-0 nova_compute[182725]: 2026-01-22 23:06:03.072 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:06 compute-0 nova_compute[182725]: 2026-01-22 23:06:06.311 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:08 compute-0 nova_compute[182725]: 2026-01-22 23:06:08.124 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:06:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:06:11 compute-0 nova_compute[182725]: 2026-01-22 23:06:11.313 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:12 compute-0 podman[246083]: 2026-01-22 23:06:12.108692611 +0000 UTC m=+0.049781622 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 23:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:06:12.477 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:06:12.477 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:06:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:06:12.477 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:06:13 compute-0 nova_compute[182725]: 2026-01-22 23:06:13.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:15 compute-0 nova_compute[182725]: 2026-01-22 23:06:15.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:16 compute-0 nova_compute[182725]: 2026-01-22 23:06:16.314 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:18 compute-0 podman[246104]: 2026-01-22 23:06:18.135714252 +0000 UTC m=+0.065198502 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 23:06:18 compute-0 nova_compute[182725]: 2026-01-22 23:06:18.167 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:18 compute-0 podman[246103]: 2026-01-22 23:06:18.168445261 +0000 UTC m=+0.103027737 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 23:06:21 compute-0 nova_compute[182725]: 2026-01-22 23:06:21.317 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:23 compute-0 nova_compute[182725]: 2026-01-22 23:06:23.169 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:26 compute-0 nova_compute[182725]: 2026-01-22 23:06:26.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:28 compute-0 nova_compute[182725]: 2026-01-22 23:06:28.199 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:31 compute-0 podman[246149]: 2026-01-22 23:06:31.131417378 +0000 UTC m=+0.060596159 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 23:06:31 compute-0 podman[246150]: 2026-01-22 23:06:31.14852282 +0000 UTC m=+0.062334651 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:06:31 compute-0 podman[246148]: 2026-01-22 23:06:31.174860001 +0000 UTC m=+0.102486604 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:06:31 compute-0 nova_compute[182725]: 2026-01-22 23:06:31.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:33 compute-0 nova_compute[182725]: 2026-01-22 23:06:33.200 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:36 compute-0 nova_compute[182725]: 2026-01-22 23:06:36.322 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:38 compute-0 nova_compute[182725]: 2026-01-22 23:06:38.202 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:41 compute-0 nova_compute[182725]: 2026-01-22 23:06:41.340 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:43 compute-0 podman[246214]: 2026-01-22 23:06:43.150770253 +0000 UTC m=+0.081805073 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 23:06:43 compute-0 nova_compute[182725]: 2026-01-22 23:06:43.206 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:45 compute-0 nova_compute[182725]: 2026-01-22 23:06:45.911 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:46 compute-0 nova_compute[182725]: 2026-01-22 23:06:46.342 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:48 compute-0 nova_compute[182725]: 2026-01-22 23:06:48.211 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:48 compute-0 podman[246235]: 2026-01-22 23:06:48.303307854 +0000 UTC m=+0.059447640 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9)
Jan 22 23:06:48 compute-0 podman[246234]: 2026-01-22 23:06:48.356671163 +0000 UTC m=+0.107600670 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 23:06:50 compute-0 nova_compute[182725]: 2026-01-22 23:06:50.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:50 compute-0 nova_compute[182725]: 2026-01-22 23:06:50.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:06:50 compute-0 nova_compute[182725]: 2026-01-22 23:06:50.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:06:50 compute-0 nova_compute[182725]: 2026-01-22 23:06:50.913 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:06:50 compute-0 nova_compute[182725]: 2026-01-22 23:06:50.914 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:51 compute-0 nova_compute[182725]: 2026-01-22 23:06:51.383 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:52 compute-0 nova_compute[182725]: 2026-01-22 23:06:52.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:53 compute-0 nova_compute[182725]: 2026-01-22 23:06:53.214 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:55 compute-0 nova_compute[182725]: 2026-01-22 23:06:55.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:55 compute-0 nova_compute[182725]: 2026-01-22 23:06:55.893 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:06:56 compute-0 nova_compute[182725]: 2026-01-22 23:06:56.385 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:56 compute-0 nova_compute[182725]: 2026-01-22 23:06:56.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.257 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.914 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:06:58 compute-0 nova_compute[182725]: 2026-01-22 23:06:58.914 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.062 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.063 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5654MB free_disk=73.3154296875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.063 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.064 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.122 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.122 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.645 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.685 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.685 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.710 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.743 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.778 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.805 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.807 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:06:59 compute-0 nova_compute[182725]: 2026-01-22 23:06:59.808 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:07:01 compute-0 nova_compute[182725]: 2026-01-22 23:07:01.386 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:02 compute-0 podman[246282]: 2026-01-22 23:07:02.125174895 +0000 UTC m=+0.056969319 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 23:07:02 compute-0 podman[246281]: 2026-01-22 23:07:02.139515879 +0000 UTC m=+0.069405006 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:07:02 compute-0 podman[246283]: 2026-01-22 23:07:02.139869618 +0000 UTC m=+0.057704477 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:07:03 compute-0 nova_compute[182725]: 2026-01-22 23:07:03.260 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:03 compute-0 nova_compute[182725]: 2026-01-22 23:07:03.807 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:06 compute-0 nova_compute[182725]: 2026-01-22 23:07:06.387 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:08 compute-0 nova_compute[182725]: 2026-01-22 23:07:08.261 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:11 compute-0 nova_compute[182725]: 2026-01-22 23:07:11.390 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:12.478 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:12.479 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:07:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:12.479 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:07:13 compute-0 nova_compute[182725]: 2026-01-22 23:07:13.263 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:14 compute-0 podman[246346]: 2026-01-22 23:07:14.121290759 +0000 UTC m=+0.062166756 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 23:07:16 compute-0 nova_compute[182725]: 2026-01-22 23:07:16.437 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:18 compute-0 nova_compute[182725]: 2026-01-22 23:07:18.265 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:19 compute-0 podman[246367]: 2026-01-22 23:07:19.128730652 +0000 UTC m=+0.059885130 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 
9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git)
Jan 22 23:07:19 compute-0 podman[246366]: 2026-01-22 23:07:19.181584139 +0000 UTC m=+0.114017839 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 22 23:07:21 compute-0 nova_compute[182725]: 2026-01-22 23:07:21.438 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:23 compute-0 nova_compute[182725]: 2026-01-22 23:07:23.266 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:26 compute-0 nova_compute[182725]: 2026-01-22 23:07:26.440 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:28 compute-0 nova_compute[182725]: 2026-01-22 23:07:28.268 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:31 compute-0 nova_compute[182725]: 2026-01-22 23:07:31.443 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:33 compute-0 podman[246411]: 2026-01-22 23:07:33.109007148 +0000 UTC m=+0.049814102 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:07:33 compute-0 podman[246418]: 2026-01-22 23:07:33.12852784 +0000 UTC m=+0.056089477 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:07:33 compute-0 podman[246412]: 2026-01-22 23:07:33.12852582 +0000 UTC m=+0.063649314 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:07:33 compute-0 nova_compute[182725]: 2026-01-22 23:07:33.270 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:36 compute-0 nova_compute[182725]: 2026-01-22 23:07:36.445 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:38 compute-0 nova_compute[182725]: 2026-01-22 23:07:38.273 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:41 compute-0 nova_compute[182725]: 2026-01-22 23:07:41.447 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:41 compute-0 nova_compute[182725]: 2026-01-22 23:07:41.493 182729 DEBUG oslo_concurrency.processutils [None req-99e82455-173d-451a-9c58-52a7cd7c6520 c792c8e8aa0d49e0a31a292ed9d309f4 6912a9182ac44bb486092f7ccd64d58c - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 23:07:41 compute-0 nova_compute[182725]: 2026-01-22 23:07:41.526 182729 DEBUG oslo_concurrency.processutils [None req-99e82455-173d-451a-9c58-52a7cd7c6520 c792c8e8aa0d49e0a31a292ed9d309f4 6912a9182ac44bb486092f7ccd64d58c - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 23:07:43 compute-0 nova_compute[182725]: 2026-01-22 23:07:43.275 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:45 compute-0 podman[246475]: 2026-01-22 23:07:45.115274481 +0000 UTC m=+0.053931834 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 22 23:07:46 compute-0 nova_compute[182725]: 2026-01-22 23:07:46.501 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:46 compute-0 nova_compute[182725]: 2026-01-22 23:07:46.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:47 compute-0 nova_compute[182725]: 2026-01-22 23:07:47.848 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:47.847 104215 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 23:07:47 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:47.849 104215 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 23:07:48 compute-0 nova_compute[182725]: 2026-01-22 23:07:48.277 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:50 compute-0 podman[246496]: 2026-01-22 23:07:50.156077112 +0000 UTC m=+0.065151051 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 23:07:50 compute-0 podman[246495]: 2026-01-22 23:07:50.19565648 +0000 UTC m=+0.111188169 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.893 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.921 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:07:51 compute-0 nova_compute[182725]: 2026-01-22 23:07:51.921 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:53 compute-0 nova_compute[182725]: 2026-01-22 23:07:53.281 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:53 compute-0 nova_compute[182725]: 2026-01-22 23:07:53.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:55 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:07:55.852 104215 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 23:07:56 compute-0 nova_compute[182725]: 2026-01-22 23:07:56.504 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:56 compute-0 nova_compute[182725]: 2026-01-22 23:07:56.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:56 compute-0 nova_compute[182725]: 2026-01-22 23:07:56.891 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.285 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.971 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.972 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.972 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:07:58 compute-0 nova_compute[182725]: 2026-01-22 23:07:58.973 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.135 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.136 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.31587219238281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.136 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.137 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.692 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.692 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.765 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.787 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.789 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:07:59 compute-0 nova_compute[182725]: 2026-01-22 23:07:59.789 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:08:01 compute-0 nova_compute[182725]: 2026-01-22 23:08:01.506 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:01 compute-0 nova_compute[182725]: 2026-01-22 23:08:01.787 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:03 compute-0 nova_compute[182725]: 2026-01-22 23:08:03.288 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:03 compute-0 nova_compute[182725]: 2026-01-22 23:08:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:04 compute-0 podman[246539]: 2026-01-22 23:08:04.120199928 +0000 UTC m=+0.054960279 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 23:08:04 compute-0 podman[246540]: 2026-01-22 23:08:04.1303945 +0000 UTC m=+0.056903137 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 23:08:04 compute-0 podman[246541]: 2026-01-22 23:08:04.149551903 +0000 UTC m=+0.065808967 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 23:08:06 compute-0 nova_compute[182725]: 2026-01-22 23:08:06.508 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:08 compute-0 nova_compute[182725]: 2026-01-22 23:08:08.291 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.119 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:08:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:08:11 compute-0 nova_compute[182725]: 2026-01-22 23:08:11.511 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:08:12.480 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:08:12.480 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:08:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:08:12.481 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:08:13 compute-0 nova_compute[182725]: 2026-01-22 23:08:13.294 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:16 compute-0 podman[246606]: 2026-01-22 23:08:16.163578218 +0000 UTC m=+0.087969655 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 23:08:16 compute-0 nova_compute[182725]: 2026-01-22 23:08:16.557 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:18 compute-0 nova_compute[182725]: 2026-01-22 23:08:18.295 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:18 compute-0 nova_compute[182725]: 2026-01-22 23:08:18.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:21 compute-0 podman[246628]: 2026-01-22 23:08:21.161753026 +0000 UTC m=+0.080538271 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, version=9.6, config_id=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Jan 22 23:08:21 compute-0 podman[246627]: 2026-01-22 23:08:21.166416081 +0000 UTC m=+0.098168777 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 23:08:21 compute-0 nova_compute[182725]: 2026-01-22 23:08:21.558 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:23 compute-0 nova_compute[182725]: 2026-01-22 23:08:23.298 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:26 compute-0 nova_compute[182725]: 2026-01-22 23:08:26.562 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:28 compute-0 nova_compute[182725]: 2026-01-22 23:08:28.301 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:31 compute-0 nova_compute[182725]: 2026-01-22 23:08:31.563 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:33 compute-0 nova_compute[182725]: 2026-01-22 23:08:33.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:35 compute-0 podman[246675]: 2026-01-22 23:08:35.127567146 +0000 UTC m=+0.063616523 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 23:08:35 compute-0 podman[246676]: 2026-01-22 23:08:35.129836162 +0000 UTC m=+0.059310757 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:08:35 compute-0 podman[246674]: 2026-01-22 23:08:35.129931184 +0000 UTC m=+0.057803379 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:08:36 compute-0 nova_compute[182725]: 2026-01-22 23:08:36.565 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:38 compute-0 nova_compute[182725]: 2026-01-22 23:08:38.307 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:41 compute-0 nova_compute[182725]: 2026-01-22 23:08:41.567 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:43 compute-0 nova_compute[182725]: 2026-01-22 23:08:43.309 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:46 compute-0 nova_compute[182725]: 2026-01-22 23:08:46.570 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:47 compute-0 podman[246736]: 2026-01-22 23:08:47.143711194 +0000 UTC m=+0.077171838 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 23:08:48 compute-0 nova_compute[182725]: 2026-01-22 23:08:48.311 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:49 compute-0 nova_compute[182725]: 2026-01-22 23:08:49.036 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:51 compute-0 nova_compute[182725]: 2026-01-22 23:08:51.573 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:51 compute-0 nova_compute[182725]: 2026-01-22 23:08:51.893 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:51 compute-0 nova_compute[182725]: 2026-01-22 23:08:51.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:08:51 compute-0 nova_compute[182725]: 2026-01-22 23:08:51.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:08:52 compute-0 podman[246757]: 2026-01-22 23:08:52.141916751 +0000 UTC m=+0.073049466 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 23:08:52 compute-0 podman[246756]: 2026-01-22 23:08:52.169289768 +0000 UTC m=+0.106169875 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:08:53 compute-0 nova_compute[182725]: 2026-01-22 23:08:53.315 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:55 compute-0 nova_compute[182725]: 2026-01-22 23:08:55.521 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:08:55 compute-0 nova_compute[182725]: 2026-01-22 23:08:55.521 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:55 compute-0 nova_compute[182725]: 2026-01-22 23:08:55.522 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:56 compute-0 nova_compute[182725]: 2026-01-22 23:08:56.575 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:58 compute-0 nova_compute[182725]: 2026-01-22 23:08:58.317 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:08:58 compute-0 nova_compute[182725]: 2026-01-22 23:08:58.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:58 compute-0 nova_compute[182725]: 2026-01-22 23:08:58.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:08:58 compute-0 nova_compute[182725]: 2026-01-22 23:08:58.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:08:59 compute-0 nova_compute[182725]: 2026-01-22 23:08:59.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:00 compute-0 nova_compute[182725]: 2026-01-22 23:09:00.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:00 compute-0 nova_compute[182725]: 2026-01-22 23:09:00.944 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:09:00 compute-0 nova_compute[182725]: 2026-01-22 23:09:00.944 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:09:00 compute-0 nova_compute[182725]: 2026-01-22 23:09:00.944 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:09:00 compute-0 nova_compute[182725]: 2026-01-22 23:09:00.944 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.113 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.114 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5667MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.115 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.115 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.198 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.199 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.366 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.395 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.398 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.399 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:09:01 compute-0 nova_compute[182725]: 2026-01-22 23:09:01.577 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:03 compute-0 nova_compute[182725]: 2026-01-22 23:09:03.320 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:03 compute-0 nova_compute[182725]: 2026-01-22 23:09:03.399 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:04 compute-0 nova_compute[182725]: 2026-01-22 23:09:04.890 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:06 compute-0 podman[246806]: 2026-01-22 23:09:06.11464268 +0000 UTC m=+0.047173526 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 23:09:06 compute-0 podman[246807]: 2026-01-22 23:09:06.133616629 +0000 UTC m=+0.053941204 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 23:09:06 compute-0 podman[246805]: 2026-01-22 23:09:06.145223106 +0000 UTC m=+0.077009374 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:09:06 compute-0 nova_compute[182725]: 2026-01-22 23:09:06.578 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:08 compute-0 nova_compute[182725]: 2026-01-22 23:09:08.321 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:11 compute-0 nova_compute[182725]: 2026-01-22 23:09:11.580 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:09:12.481 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:09:12.481 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:09:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:09:12.481 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:09:13 compute-0 nova_compute[182725]: 2026-01-22 23:09:13.324 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:16 compute-0 nova_compute[182725]: 2026-01-22 23:09:16.582 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:18 compute-0 podman[246872]: 2026-01-22 23:09:18.139176738 +0000 UTC m=+0.072265037 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:09:18 compute-0 nova_compute[182725]: 2026-01-22 23:09:18.327 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:21 compute-0 nova_compute[182725]: 2026-01-22 23:09:21.585 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:23 compute-0 podman[246893]: 2026-01-22 23:09:23.125448521 +0000 UTC m=+0.052275193 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 23:09:23 compute-0 podman[246892]: 2026-01-22 23:09:23.157032191 +0000 UTC m=+0.086732574 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 23:09:23 compute-0 nova_compute[182725]: 2026-01-22 23:09:23.328 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:26 compute-0 nova_compute[182725]: 2026-01-22 23:09:26.587 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:28 compute-0 nova_compute[182725]: 2026-01-22 23:09:28.365 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:31 compute-0 nova_compute[182725]: 2026-01-22 23:09:31.590 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:33 compute-0 nova_compute[182725]: 2026-01-22 23:09:33.368 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:36 compute-0 nova_compute[182725]: 2026-01-22 23:09:36.614 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:37 compute-0 podman[246936]: 2026-01-22 23:09:37.134055016 +0000 UTC m=+0.058966778 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 23:09:37 compute-0 podman[246935]: 2026-01-22 23:09:37.137865341 +0000 UTC m=+0.066678749 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:09:37 compute-0 podman[246934]: 2026-01-22 23:09:37.162464319 +0000 UTC m=+0.085953886 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:09:38 compute-0 nova_compute[182725]: 2026-01-22 23:09:38.371 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:41 compute-0 nova_compute[182725]: 2026-01-22 23:09:41.617 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:43 compute-0 nova_compute[182725]: 2026-01-22 23:09:43.373 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:46 compute-0 nova_compute[182725]: 2026-01-22 23:09:46.618 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:48 compute-0 nova_compute[182725]: 2026-01-22 23:09:48.376 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:48 compute-0 podman[247000]: 2026-01-22 23:09:48.454716498 +0000 UTC m=+0.053891643 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 23:09:48 compute-0 nova_compute[182725]: 2026-01-22 23:09:48.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:51 compute-0 nova_compute[182725]: 2026-01-22 23:09:51.658 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.378 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.899 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.899 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.899 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.915 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:09:53 compute-0 nova_compute[182725]: 2026-01-22 23:09:53.915 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:54 compute-0 podman[247021]: 2026-01-22 23:09:54.123996909 +0000 UTC m=+0.057073861 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 23:09:54 compute-0 podman[247020]: 2026-01-22 23:09:54.151092939 +0000 UTC m=+0.086184011 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 23:09:54 compute-0 nova_compute[182725]: 2026-01-22 23:09:54.897 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:56 compute-0 nova_compute[182725]: 2026-01-22 23:09:56.662 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:58 compute-0 nova_compute[182725]: 2026-01-22 23:09:58.380 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:09:58 compute-0 nova_compute[182725]: 2026-01-22 23:09:58.894 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:58 compute-0 nova_compute[182725]: 2026-01-22 23:09:58.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:09:59 compute-0 nova_compute[182725]: 2026-01-22 23:09:59.893 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:09:59 compute-0 nova_compute[182725]: 2026-01-22 23:09:59.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.909 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.933 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.933 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.934 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:10:00 compute-0 nova_compute[182725]: 2026-01-22 23:10:00.934 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.120 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.121 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5661MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.121 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.169 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.169 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.189 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.202 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.203 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.203 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:10:01 compute-0 nova_compute[182725]: 2026-01-22 23:10:01.667 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:03 compute-0 nova_compute[182725]: 2026-01-22 23:10:03.383 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:04 compute-0 nova_compute[182725]: 2026-01-22 23:10:04.184 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:04 compute-0 nova_compute[182725]: 2026-01-22 23:10:04.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:06 compute-0 nova_compute[182725]: 2026-01-22 23:10:06.669 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:08 compute-0 podman[247070]: 2026-01-22 23:10:08.139559048 +0000 UTC m=+0.051990746 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:10:08 compute-0 podman[247069]: 2026-01-22 23:10:08.140297886 +0000 UTC m=+0.071386615 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:10:08 compute-0 podman[247076]: 2026-01-22 23:10:08.153615965 +0000 UTC m=+0.064346941 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:10:08 compute-0 nova_compute[182725]: 2026-01-22 23:10:08.387 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:10:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:10:11 compute-0 nova_compute[182725]: 2026-01-22 23:10:11.673 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:10:12.482 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:10:12.482 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:10:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:10:12.482 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:10:13 compute-0 nova_compute[182725]: 2026-01-22 23:10:13.390 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:16 compute-0 nova_compute[182725]: 2026-01-22 23:10:16.675 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:18 compute-0 nova_compute[182725]: 2026-01-22 23:10:18.393 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:19 compute-0 podman[247134]: 2026-01-22 23:10:19.192671615 +0000 UTC m=+0.116507291 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:10:21 compute-0 nova_compute[182725]: 2026-01-22 23:10:21.677 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:22 compute-0 nova_compute[182725]: 2026-01-22 23:10:22.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:23 compute-0 nova_compute[182725]: 2026-01-22 23:10:23.397 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:25 compute-0 podman[247156]: 2026-01-22 23:10:25.139734181 +0000 UTC m=+0.068995696 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41)
Jan 22 23:10:25 compute-0 podman[247155]: 2026-01-22 23:10:25.173484965 +0000 UTC m=+0.107268752 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 23:10:26 compute-0 nova_compute[182725]: 2026-01-22 23:10:26.679 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:27 compute-0 nova_compute[182725]: 2026-01-22 23:10:27.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:28 compute-0 nova_compute[182725]: 2026-01-22 23:10:28.401 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:31 compute-0 nova_compute[182725]: 2026-01-22 23:10:31.680 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:33 compute-0 nova_compute[182725]: 2026-01-22 23:10:33.404 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:34 compute-0 nova_compute[182725]: 2026-01-22 23:10:34.906 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:34 compute-0 nova_compute[182725]: 2026-01-22 23:10:34.907 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 23:10:35 compute-0 nova_compute[182725]: 2026-01-22 23:10:35.041 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 23:10:36 compute-0 nova_compute[182725]: 2026-01-22 23:10:36.682 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:37 compute-0 nova_compute[182725]: 2026-01-22 23:10:37.013 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:38 compute-0 nova_compute[182725]: 2026-01-22 23:10:38.407 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:39 compute-0 podman[247203]: 2026-01-22 23:10:39.148923102 +0000 UTC m=+0.073791505 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:10:39 compute-0 podman[247202]: 2026-01-22 23:10:39.150633304 +0000 UTC m=+0.075677141 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 23:10:39 compute-0 podman[247204]: 2026-01-22 23:10:39.170768311 +0000 UTC m=+0.082211352 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 23:10:41 compute-0 nova_compute[182725]: 2026-01-22 23:10:41.684 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:43 compute-0 nova_compute[182725]: 2026-01-22 23:10:43.410 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:46 compute-0 nova_compute[182725]: 2026-01-22 23:10:46.687 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:48 compute-0 nova_compute[182725]: 2026-01-22 23:10:48.413 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:48 compute-0 nova_compute[182725]: 2026-01-22 23:10:48.885 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:50 compute-0 podman[247266]: 2026-01-22 23:10:50.148032879 +0000 UTC m=+0.069574038 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 23:10:50 compute-0 podman[198588]: time="2026-01-22T23:10:50Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 23:10:50 compute-0 podman[198588]: @ - - [22/Jan/2026:23:10:50 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 21519 "" "Go-http-client/1.1"
Jan 22 23:10:51 compute-0 nova_compute[182725]: 2026-01-22 23:10:51.687 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:53 compute-0 nova_compute[182725]: 2026-01-22 23:10:53.415 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:54 compute-0 nova_compute[182725]: 2026-01-22 23:10:54.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:55 compute-0 nova_compute[182725]: 2026-01-22 23:10:55.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:55 compute-0 nova_compute[182725]: 2026-01-22 23:10:55.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:10:55 compute-0 nova_compute[182725]: 2026-01-22 23:10:55.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:10:55 compute-0 nova_compute[182725]: 2026-01-22 23:10:55.903 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:10:55 compute-0 nova_compute[182725]: 2026-01-22 23:10:55.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:56 compute-0 podman[247288]: 2026-01-22 23:10:56.114003748 +0000 UTC m=+0.045101963 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41)
Jan 22 23:10:56 compute-0 podman[247287]: 2026-01-22 23:10:56.134515755 +0000 UTC m=+0.070529892 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 23:10:56 compute-0 nova_compute[182725]: 2026-01-22 23:10:56.688 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:58 compute-0 nova_compute[182725]: 2026-01-22 23:10:58.418 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:10:58 compute-0 nova_compute[182725]: 2026-01-22 23:10:58.894 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:10:58 compute-0 nova_compute[182725]: 2026-01-22 23:10:58.894 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:11:01 compute-0 nova_compute[182725]: 2026-01-22 23:11:01.690 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.917 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:11:02 compute-0 nova_compute[182725]: 2026-01-22 23:11:02.918 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.046 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.047 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5668MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.047 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.047 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.103 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.103 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.126 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.140 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.142 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.142 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:11:03 compute-0 nova_compute[182725]: 2026-01-22 23:11:03.420 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:04 compute-0 nova_compute[182725]: 2026-01-22 23:11:04.147 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:05 compute-0 nova_compute[182725]: 2026-01-22 23:11:05.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:06 compute-0 nova_compute[182725]: 2026-01-22 23:11:06.694 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:08 compute-0 nova_compute[182725]: 2026-01-22 23:11:08.424 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:10 compute-0 podman[247332]: 2026-01-22 23:11:10.123988437 +0000 UTC m=+0.045062733 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 23:11:10 compute-0 podman[247333]: 2026-01-22 23:11:10.149815895 +0000 UTC m=+0.062396141 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:11:10 compute-0 podman[247331]: 2026-01-22 23:11:10.157556096 +0000 UTC m=+0.074723825 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:11:11 compute-0 nova_compute[182725]: 2026-01-22 23:11:11.696 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:11:12.483 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:11:12.483 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:11:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:11:12.483 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:11:13 compute-0 nova_compute[182725]: 2026-01-22 23:11:13.428 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:16 compute-0 nova_compute[182725]: 2026-01-22 23:11:16.726 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:18 compute-0 nova_compute[182725]: 2026-01-22 23:11:18.432 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:21 compute-0 podman[247397]: 2026-01-22 23:11:21.124481756 +0000 UTC m=+0.058239108 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 23:11:21 compute-0 nova_compute[182725]: 2026-01-22 23:11:21.730 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:23 compute-0 nova_compute[182725]: 2026-01-22 23:11:23.436 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:26 compute-0 nova_compute[182725]: 2026-01-22 23:11:26.731 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:27 compute-0 podman[247419]: 2026-01-22 23:11:27.158660216 +0000 UTC m=+0.079635637 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 22 23:11:27 compute-0 podman[247418]: 2026-01-22 23:11:27.182671759 +0000 UTC m=+0.118217440 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:11:28 compute-0 nova_compute[182725]: 2026-01-22 23:11:28.440 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:31 compute-0 nova_compute[182725]: 2026-01-22 23:11:31.732 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:33 compute-0 nova_compute[182725]: 2026-01-22 23:11:33.443 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:34 compute-0 nova_compute[182725]: 2026-01-22 23:11:34.824 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:36 compute-0 nova_compute[182725]: 2026-01-22 23:11:36.734 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:38 compute-0 nova_compute[182725]: 2026-01-22 23:11:38.446 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:41 compute-0 podman[247467]: 2026-01-22 23:11:41.112635521 +0000 UTC m=+0.046334245 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:11:41 compute-0 podman[247466]: 2026-01-22 23:11:41.128539424 +0000 UTC m=+0.054551588 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:11:41 compute-0 podman[247465]: 2026-01-22 23:11:41.141528644 +0000 UTC m=+0.082376364 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:11:41 compute-0 nova_compute[182725]: 2026-01-22 23:11:41.736 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:43 compute-0 nova_compute[182725]: 2026-01-22 23:11:43.449 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:46 compute-0 nova_compute[182725]: 2026-01-22 23:11:46.738 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:48 compute-0 nova_compute[182725]: 2026-01-22 23:11:48.452 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:49 compute-0 nova_compute[182725]: 2026-01-22 23:11:49.904 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:51 compute-0 nova_compute[182725]: 2026-01-22 23:11:51.741 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:52 compute-0 podman[247533]: 2026-01-22 23:11:52.114852454 +0000 UTC m=+0.052894327 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 23:11:53 compute-0 nova_compute[182725]: 2026-01-22 23:11:53.454 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:56 compute-0 nova_compute[182725]: 2026-01-22 23:11:56.778 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:11:56 compute-0 nova_compute[182725]: 2026-01-22 23:11:56.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:57 compute-0 nova_compute[182725]: 2026-01-22 23:11:57.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:57 compute-0 nova_compute[182725]: 2026-01-22 23:11:57.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:11:57 compute-0 nova_compute[182725]: 2026-01-22 23:11:57.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:11:57 compute-0 nova_compute[182725]: 2026-01-22 23:11:57.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:11:57 compute-0 nova_compute[182725]: 2026-01-22 23:11:57.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:11:58 compute-0 podman[247553]: 2026-01-22 23:11:58.128526219 +0000 UTC m=+0.068940093 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 23:11:58 compute-0 podman[247554]: 2026-01-22 23:11:58.134699452 +0000 UTC m=+0.067362414 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 22 23:11:58 compute-0 nova_compute[182725]: 2026-01-22 23:11:58.456 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:00 compute-0 nova_compute[182725]: 2026-01-22 23:12:00.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:00 compute-0 nova_compute[182725]: 2026-01-22 23:12:00.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:12:01 compute-0 nova_compute[182725]: 2026-01-22 23:12:01.780 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:02 compute-0 nova_compute[182725]: 2026-01-22 23:12:02.898 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.471 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.915 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.916 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:12:03 compute-0 nova_compute[182725]: 2026-01-22 23:12:03.916 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.066 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.067 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.068 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.068 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.124 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.125 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.142 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.157 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.158 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.172 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.202 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.221 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.233 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.234 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:12:04 compute-0 nova_compute[182725]: 2026-01-22 23:12:04.234 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:12:06 compute-0 nova_compute[182725]: 2026-01-22 23:12:06.235 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:06 compute-0 nova_compute[182725]: 2026-01-22 23:12:06.782 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:06 compute-0 nova_compute[182725]: 2026-01-22 23:12:06.891 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:08 compute-0 nova_compute[182725]: 2026-01-22 23:12:08.479 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:12:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:12:11 compute-0 nova_compute[182725]: 2026-01-22 23:12:11.785 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:12 compute-0 podman[247601]: 2026-01-22 23:12:12.131062532 +0000 UTC m=+0.060182697 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 23:12:12 compute-0 podman[247600]: 2026-01-22 23:12:12.142137746 +0000 UTC m=+0.070795659 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:12:12 compute-0 podman[247602]: 2026-01-22 23:12:12.165543263 +0000 UTC m=+0.089266804 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:12:12.486 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:12:12.486 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:12:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:12:12.486 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:12:13 compute-0 nova_compute[182725]: 2026-01-22 23:12:13.483 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:16 compute-0 nova_compute[182725]: 2026-01-22 23:12:16.786 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:18 compute-0 nova_compute[182725]: 2026-01-22 23:12:18.486 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:21 compute-0 nova_compute[182725]: 2026-01-22 23:12:21.788 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:22 compute-0 nova_compute[182725]: 2026-01-22 23:12:22.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:23 compute-0 podman[247664]: 2026-01-22 23:12:23.15412941 +0000 UTC m=+0.088828373 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 23:12:23 compute-0 nova_compute[182725]: 2026-01-22 23:12:23.489 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:26 compute-0 nova_compute[182725]: 2026-01-22 23:12:26.832 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:28 compute-0 nova_compute[182725]: 2026-01-22 23:12:28.492 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:29 compute-0 podman[247684]: 2026-01-22 23:12:29.174099102 +0000 UTC m=+0.102447880 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:12:29 compute-0 podman[247685]: 2026-01-22 23:12:29.176818619 +0000 UTC m=+0.101782393 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 22 23:12:31 compute-0 nova_compute[182725]: 2026-01-22 23:12:31.834 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:33 compute-0 nova_compute[182725]: 2026-01-22 23:12:33.496 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:36 compute-0 nova_compute[182725]: 2026-01-22 23:12:36.835 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:38 compute-0 nova_compute[182725]: 2026-01-22 23:12:38.500 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:41 compute-0 nova_compute[182725]: 2026-01-22 23:12:41.837 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:43 compute-0 podman[247733]: 2026-01-22 23:12:43.148941523 +0000 UTC m=+0.070420030 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 23:12:43 compute-0 podman[247734]: 2026-01-22 23:12:43.151529446 +0000 UTC m=+0.060915874 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:12:43 compute-0 podman[247740]: 2026-01-22 23:12:43.154873689 +0000 UTC m=+0.059315455 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:12:43 compute-0 nova_compute[182725]: 2026-01-22 23:12:43.502 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:46 compute-0 nova_compute[182725]: 2026-01-22 23:12:46.869 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:48 compute-0 nova_compute[182725]: 2026-01-22 23:12:48.505 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:51 compute-0 nova_compute[182725]: 2026-01-22 23:12:51.872 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:51 compute-0 nova_compute[182725]: 2026-01-22 23:12:51.907 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:53 compute-0 nova_compute[182725]: 2026-01-22 23:12:53.511 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:54 compute-0 podman[247799]: 2026-01-22 23:12:54.133772905 +0000 UTC m=+0.067567308 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:12:56 compute-0 nova_compute[182725]: 2026-01-22 23:12:56.873 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:58 compute-0 nova_compute[182725]: 2026-01-22 23:12:58.524 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:12:58 compute-0 nova_compute[182725]: 2026-01-22 23:12:58.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:58 compute-0 nova_compute[182725]: 2026-01-22 23:12:58.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:59 compute-0 nova_compute[182725]: 2026-01-22 23:12:59.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:12:59 compute-0 nova_compute[182725]: 2026-01-22 23:12:59.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:12:59 compute-0 nova_compute[182725]: 2026-01-22 23:12:59.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:12:59 compute-0 nova_compute[182725]: 2026-01-22 23:12:59.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:13:00 compute-0 podman[247820]: 2026-01-22 23:13:00.157768035 +0000 UTC m=+0.083619134 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 23:13:00 compute-0 podman[247819]: 2026-01-22 23:13:00.183309985 +0000 UTC m=+0.106756805 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 23:13:01 compute-0 nova_compute[182725]: 2026-01-22 23:13:01.925 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:02 compute-0 nova_compute[182725]: 2026-01-22 23:13:02.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:02 compute-0 nova_compute[182725]: 2026-01-22 23:13:02.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.527 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.913 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:13:03 compute-0 nova_compute[182725]: 2026-01-22 23:13:03.914 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.109 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.110 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5672MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.111 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.111 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.164 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.165 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.250 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.269 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.270 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:13:04 compute-0 nova_compute[182725]: 2026-01-22 23:13:04.270 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:13:06 compute-0 nova_compute[182725]: 2026-01-22 23:13:06.271 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:06 compute-0 nova_compute[182725]: 2026-01-22 23:13:06.928 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:07 compute-0 nova_compute[182725]: 2026-01-22 23:13:07.892 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:08 compute-0 nova_compute[182725]: 2026-01-22 23:13:08.566 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:11 compute-0 nova_compute[182725]: 2026-01-22 23:13:11.933 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:13:12.487 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:13:12.487 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:13:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:13:12.488 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:13:13 compute-0 nova_compute[182725]: 2026-01-22 23:13:13.626 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:14 compute-0 podman[247866]: 2026-01-22 23:13:14.155514622 +0000 UTC m=+0.077087624 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:13:14 compute-0 podman[247867]: 2026-01-22 23:13:14.168664297 +0000 UTC m=+0.082487508 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 23:13:14 compute-0 podman[247868]: 2026-01-22 23:13:14.179616707 +0000 UTC m=+0.089335857 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:13:16 compute-0 nova_compute[182725]: 2026-01-22 23:13:16.939 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:18 compute-0 nova_compute[182725]: 2026-01-22 23:13:18.672 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:21 compute-0 nova_compute[182725]: 2026-01-22 23:13:21.994 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:23 compute-0 nova_compute[182725]: 2026-01-22 23:13:23.676 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:25 compute-0 podman[247930]: 2026-01-22 23:13:25.157691753 +0000 UTC m=+0.077767491 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 23:13:27 compute-0 nova_compute[182725]: 2026-01-22 23:13:27.026 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:28 compute-0 nova_compute[182725]: 2026-01-22 23:13:28.679 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:31 compute-0 podman[247951]: 2026-01-22 23:13:31.15850415 +0000 UTC m=+0.081805780 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Jan 22 23:13:31 compute-0 podman[247950]: 2026-01-22 23:13:31.17143846 +0000 UTC m=+0.100391190 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 23:13:32 compute-0 nova_compute[182725]: 2026-01-22 23:13:32.027 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:33 compute-0 nova_compute[182725]: 2026-01-22 23:13:33.683 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:37 compute-0 nova_compute[182725]: 2026-01-22 23:13:37.078 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:38 compute-0 nova_compute[182725]: 2026-01-22 23:13:38.686 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:42 compute-0 nova_compute[182725]: 2026-01-22 23:13:42.099 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:43 compute-0 nova_compute[182725]: 2026-01-22 23:13:43.690 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:45 compute-0 podman[248004]: 2026-01-22 23:13:45.356149853 +0000 UTC m=+0.064928784 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 23:13:45 compute-0 podman[247996]: 2026-01-22 23:13:45.369886033 +0000 UTC m=+0.091728706 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 23:13:45 compute-0 podman[247998]: 2026-01-22 23:13:45.375762448 +0000 UTC m=+0.091462289 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:13:47 compute-0 nova_compute[182725]: 2026-01-22 23:13:47.101 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:48 compute-0 nova_compute[182725]: 2026-01-22 23:13:48.694 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:52 compute-0 nova_compute[182725]: 2026-01-22 23:13:52.103 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:53 compute-0 nova_compute[182725]: 2026-01-22 23:13:53.737 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:53 compute-0 nova_compute[182725]: 2026-01-22 23:13:53.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:56 compute-0 podman[248061]: 2026-01-22 23:13:56.12600782 +0000 UTC m=+0.055986134 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 23:13:57 compute-0 nova_compute[182725]: 2026-01-22 23:13:57.104 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:58 compute-0 nova_compute[182725]: 2026-01-22 23:13:58.740 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:13:58 compute-0 nova_compute[182725]: 2026-01-22 23:13:58.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:13:59 compute-0 nova_compute[182725]: 2026-01-22 23:13:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:01 compute-0 nova_compute[182725]: 2026-01-22 23:14:01.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:01 compute-0 nova_compute[182725]: 2026-01-22 23:14:01.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:14:01 compute-0 nova_compute[182725]: 2026-01-22 23:14:01.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:14:01 compute-0 nova_compute[182725]: 2026-01-22 23:14:01.908 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:14:02 compute-0 nova_compute[182725]: 2026-01-22 23:14:02.105 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:02 compute-0 podman[248082]: 2026-01-22 23:14:02.136580739 +0000 UTC m=+0.078282433 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:14:02 compute-0 podman[248083]: 2026-01-22 23:14:02.141374218 +0000 UTC m=+0.069175689 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Jan 22 23:14:03 compute-0 nova_compute[182725]: 2026-01-22 23:14:03.744 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:03 compute-0 nova_compute[182725]: 2026-01-22 23:14:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:03 compute-0 nova_compute[182725]: 2026-01-22 23:14:03.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.948 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.948 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.948 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:14:05 compute-0 nova_compute[182725]: 2026-01-22 23:14:05.949 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.100 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.102 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.102 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.102 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.166 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.166 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.285 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.302 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.303 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:14:06 compute-0 nova_compute[182725]: 2026-01-22 23:14:06.303 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:14:07 compute-0 nova_compute[182725]: 2026-01-22 23:14:07.107 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:07 compute-0 nova_compute[182725]: 2026-01-22 23:14:07.302 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:08 compute-0 nova_compute[182725]: 2026-01-22 23:14:08.747 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:08 compute-0 nova_compute[182725]: 2026-01-22 23:14:08.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:14:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:14:12 compute-0 nova_compute[182725]: 2026-01-22 23:14:12.109 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:14:12.488 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:14:12.488 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:14:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:14:12.489 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:14:13 compute-0 nova_compute[182725]: 2026-01-22 23:14:13.750 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:16 compute-0 podman[248128]: 2026-01-22 23:14:16.155775355 +0000 UTC m=+0.073245929 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 23:14:16 compute-0 podman[248127]: 2026-01-22 23:14:16.155946459 +0000 UTC m=+0.081564344 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 23:14:16 compute-0 podman[248129]: 2026-01-22 23:14:16.164722876 +0000 UTC m=+0.075704670 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:14:17 compute-0 nova_compute[182725]: 2026-01-22 23:14:17.112 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:18 compute-0 nova_compute[182725]: 2026-01-22 23:14:18.753 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:22 compute-0 nova_compute[182725]: 2026-01-22 23:14:22.114 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:23 compute-0 nova_compute[182725]: 2026-01-22 23:14:23.757 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:26 compute-0 nova_compute[182725]: 2026-01-22 23:14:26.884 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:27 compute-0 podman[248190]: 2026-01-22 23:14:27.116658688 +0000 UTC m=+0.056402613 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 23:14:27 compute-0 nova_compute[182725]: 2026-01-22 23:14:27.117 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:28 compute-0 nova_compute[182725]: 2026-01-22 23:14:28.759 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:32 compute-0 nova_compute[182725]: 2026-01-22 23:14:32.117 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:33 compute-0 podman[248211]: 2026-01-22 23:14:33.11348688 +0000 UTC m=+0.047460033 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 22 23:14:33 compute-0 podman[248210]: 2026-01-22 23:14:33.146530815 +0000 UTC m=+0.080263592 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 23:14:33 compute-0 nova_compute[182725]: 2026-01-22 23:14:33.761 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:37 compute-0 nova_compute[182725]: 2026-01-22 23:14:37.118 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:38 compute-0 nova_compute[182725]: 2026-01-22 23:14:38.765 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:42 compute-0 nova_compute[182725]: 2026-01-22 23:14:42.120 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:43 compute-0 nova_compute[182725]: 2026-01-22 23:14:43.768 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:47 compute-0 podman[248258]: 2026-01-22 23:14:47.120654956 +0000 UTC m=+0.053966484 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 23:14:47 compute-0 nova_compute[182725]: 2026-01-22 23:14:47.121 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:47 compute-0 podman[248259]: 2026-01-22 23:14:47.140640209 +0000 UTC m=+0.065355014 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 23:14:47 compute-0 podman[248260]: 2026-01-22 23:14:47.154653215 +0000 UTC m=+0.077673888 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 23:14:48 compute-0 nova_compute[182725]: 2026-01-22 23:14:48.771 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:52 compute-0 nova_compute[182725]: 2026-01-22 23:14:52.126 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:53 compute-0 nova_compute[182725]: 2026-01-22 23:14:53.774 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:53 compute-0 nova_compute[182725]: 2026-01-22 23:14:53.900 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:14:57 compute-0 nova_compute[182725]: 2026-01-22 23:14:57.127 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:58 compute-0 podman[248322]: 2026-01-22 23:14:58.146134202 +0000 UTC m=+0.075198407 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 23:14:58 compute-0 nova_compute[182725]: 2026-01-22 23:14:58.777 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:14:59 compute-0 nova_compute[182725]: 2026-01-22 23:14:59.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:01 compute-0 nova_compute[182725]: 2026-01-22 23:15:01.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.128 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.890 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.905 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:02 compute-0 nova_compute[182725]: 2026-01-22 23:15:02.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 23:15:03 compute-0 nova_compute[182725]: 2026-01-22 23:15:03.817 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:04 compute-0 podman[248343]: 2026-01-22 23:15:04.114637692 +0000 UTC m=+0.051171994 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Jan 22 23:15:04 compute-0 podman[248342]: 2026-01-22 23:15:04.134838751 +0000 UTC m=+0.075925185 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.898 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.898 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.899 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.950 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.951 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.951 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:15:05 compute-0 nova_compute[182725]: 2026-01-22 23:15:05.952 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.095 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.096 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5688MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.096 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.096 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.160 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.161 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.345 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.361 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.363 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:15:06 compute-0 nova_compute[182725]: 2026-01-22 23:15:06.363 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:15:07 compute-0 nova_compute[182725]: 2026-01-22 23:15:07.130 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:07 compute-0 nova_compute[182725]: 2026-01-22 23:15:07.353 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:08 compute-0 nova_compute[182725]: 2026-01-22 23:15:08.820 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:08 compute-0 nova_compute[182725]: 2026-01-22 23:15:08.887 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:10 compute-0 nova_compute[182725]: 2026-01-22 23:15:10.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:12 compute-0 nova_compute[182725]: 2026-01-22 23:15:12.132 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:15:12.489 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:15:12.490 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:15:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:15:12.490 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:15:13 compute-0 nova_compute[182725]: 2026-01-22 23:15:13.879 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:17 compute-0 nova_compute[182725]: 2026-01-22 23:15:17.133 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:18 compute-0 podman[248394]: 2026-01-22 23:15:18.138602016 +0000 UTC m=+0.055643555 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:15:18 compute-0 podman[248388]: 2026-01-22 23:15:18.139052897 +0000 UTC m=+0.059621013 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 23:15:18 compute-0 podman[248387]: 2026-01-22 23:15:18.147374922 +0000 UTC m=+0.078737275 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:15:18 compute-0 nova_compute[182725]: 2026-01-22 23:15:18.882 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:22 compute-0 nova_compute[182725]: 2026-01-22 23:15:22.134 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:23 compute-0 nova_compute[182725]: 2026-01-22 23:15:23.886 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:27 compute-0 nova_compute[182725]: 2026-01-22 23:15:27.136 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:28 compute-0 nova_compute[182725]: 2026-01-22 23:15:28.889 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:29 compute-0 podman[248450]: 2026-01-22 23:15:29.149717918 +0000 UTC m=+0.075220738 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 23:15:32 compute-0 nova_compute[182725]: 2026-01-22 23:15:32.139 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:32 compute-0 nova_compute[182725]: 2026-01-22 23:15:32.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:33 compute-0 nova_compute[182725]: 2026-01-22 23:15:33.929 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:35 compute-0 podman[248471]: 2026-01-22 23:15:35.137759381 +0000 UTC m=+0.066902313 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Jan 22 23:15:35 compute-0 podman[248470]: 2026-01-22 23:15:35.250937815 +0000 UTC m=+0.185366747 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 23:15:37 compute-0 nova_compute[182725]: 2026-01-22 23:15:37.140 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:38 compute-0 nova_compute[182725]: 2026-01-22 23:15:38.932 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:41 compute-0 nova_compute[182725]: 2026-01-22 23:15:41.902 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:41 compute-0 nova_compute[182725]: 2026-01-22 23:15:41.902 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 23:15:41 compute-0 nova_compute[182725]: 2026-01-22 23:15:41.919 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 23:15:42 compute-0 nova_compute[182725]: 2026-01-22 23:15:42.141 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:43 compute-0 nova_compute[182725]: 2026-01-22 23:15:43.975 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:47 compute-0 nova_compute[182725]: 2026-01-22 23:15:47.142 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:49 compute-0 nova_compute[182725]: 2026-01-22 23:15:49.031 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:49 compute-0 podman[248515]: 2026-01-22 23:15:49.129310344 +0000 UTC m=+0.058495785 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:15:49 compute-0 podman[248513]: 2026-01-22 23:15:49.149009871 +0000 UTC m=+0.073488646 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 23:15:49 compute-0 podman[248514]: 2026-01-22 23:15:49.155300246 +0000 UTC m=+0.085625575 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 23:15:52 compute-0 nova_compute[182725]: 2026-01-22 23:15:52.144 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:53 compute-0 nova_compute[182725]: 2026-01-22 23:15:53.901 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:15:54 compute-0 nova_compute[182725]: 2026-01-22 23:15:54.034 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:57 compute-0 nova_compute[182725]: 2026-01-22 23:15:57.145 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:15:59 compute-0 nova_compute[182725]: 2026-01-22 23:15:59.037 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:00 compute-0 podman[248576]: 2026-01-22 23:16:00.167402644 +0000 UTC m=+0.097455117 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 23:16:01 compute-0 nova_compute[182725]: 2026-01-22 23:16:01.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:02 compute-0 nova_compute[182725]: 2026-01-22 23:16:02.146 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:03 compute-0 nova_compute[182725]: 2026-01-22 23:16:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:03 compute-0 nova_compute[182725]: 2026-01-22 23:16:03.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:16:03 compute-0 nova_compute[182725]: 2026-01-22 23:16:03.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:16:03 compute-0 nova_compute[182725]: 2026-01-22 23:16:03.905 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:16:03 compute-0 nova_compute[182725]: 2026-01-22 23:16:03.906 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:04 compute-0 nova_compute[182725]: 2026-01-22 23:16:04.040 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:05 compute-0 nova_compute[182725]: 2026-01-22 23:16:05.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:05 compute-0 nova_compute[182725]: 2026-01-22 23:16:05.888 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:16:06 compute-0 podman[248596]: 2026-01-22 23:16:06.136300203 +0000 UTC m=+0.067354014 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 22 23:16:06 compute-0 podman[248595]: 2026-01-22 23:16:06.168769655 +0000 UTC m=+0.102483061 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.150 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.919 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.920 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:16:07 compute-0 nova_compute[182725]: 2026-01-22 23:16:07.921 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.063 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.064 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5678MB free_disk=73.31585311889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.064 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.064 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.144 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.145 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.175 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.210 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.211 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:16:08 compute-0 nova_compute[182725]: 2026-01-22 23:16:08.211 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:16:09 compute-0 nova_compute[182725]: 2026-01-22 23:16:09.043 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:09 compute-0 ceilometer_agent_compute[192401]: 2026-01-22 23:16:09.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 23:16:11 compute-0 nova_compute[182725]: 2026-01-22 23:16:11.211 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:11 compute-0 nova_compute[182725]: 2026-01-22 23:16:11.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:12 compute-0 nova_compute[182725]: 2026-01-22 23:16:12.150 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:16:12.491 104215 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:16:12.491 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:16:12 compute-0 ovn_metadata_agent[104210]: 2026-01-22 23:16:12.491 104215 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:16:14 compute-0 nova_compute[182725]: 2026-01-22 23:16:14.093 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:17 compute-0 nova_compute[182725]: 2026-01-22 23:16:17.151 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:19 compute-0 nova_compute[182725]: 2026-01-22 23:16:19.097 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:20 compute-0 podman[248637]: 2026-01-22 23:16:20.140990409 +0000 UTC m=+0.070594624 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:16:20 compute-0 podman[248638]: 2026-01-22 23:16:20.140975289 +0000 UTC m=+0.063629862 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 23:16:20 compute-0 podman[248644]: 2026-01-22 23:16:20.157930717 +0000 UTC m=+0.065766244 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 23:16:22 compute-0 nova_compute[182725]: 2026-01-22 23:16:22.153 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:24 compute-0 nova_compute[182725]: 2026-01-22 23:16:24.100 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:27 compute-0 nova_compute[182725]: 2026-01-22 23:16:27.155 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:29 compute-0 nova_compute[182725]: 2026-01-22 23:16:29.103 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:30 compute-0 nova_compute[182725]: 2026-01-22 23:16:30.883 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:31 compute-0 podman[248702]: 2026-01-22 23:16:31.162241572 +0000 UTC m=+0.096692998 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 23:16:32 compute-0 nova_compute[182725]: 2026-01-22 23:16:32.156 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:34 compute-0 nova_compute[182725]: 2026-01-22 23:16:34.144 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:37 compute-0 podman[248724]: 2026-01-22 23:16:37.151992407 +0000 UTC m=+0.088808364 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 23:16:37 compute-0 podman[248725]: 2026-01-22 23:16:37.152514879 +0000 UTC m=+0.086126867 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, release=1755695350, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41)
Jan 22 23:16:37 compute-0 nova_compute[182725]: 2026-01-22 23:16:37.157 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:39 compute-0 nova_compute[182725]: 2026-01-22 23:16:39.205 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:42 compute-0 nova_compute[182725]: 2026-01-22 23:16:42.159 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:44 compute-0 nova_compute[182725]: 2026-01-22 23:16:44.248 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:47 compute-0 nova_compute[182725]: 2026-01-22 23:16:47.162 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:49 compute-0 nova_compute[182725]: 2026-01-22 23:16:49.249 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:51 compute-0 podman[248772]: 2026-01-22 23:16:51.14673682 +0000 UTC m=+0.073407302 container health_status 08a00ed682da95767df6f79a928ea7549126de6a21442a47f4d1857064f8fe53 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 23:16:51 compute-0 podman[248774]: 2026-01-22 23:16:51.153182379 +0000 UTC m=+0.065905677 container health_status 86ba20c6d5281e8ca4b016c23f999b490e86b59a5ad75621200b7f563fe2de26 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 23:16:51 compute-0 podman[248773]: 2026-01-22 23:16:51.164638392 +0000 UTC m=+0.080893857 container health_status 7f2c183d163ccef92996d02fea9434ea4e6573bb262bf8b8aa278d3374fa9462 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 23:16:52 compute-0 nova_compute[182725]: 2026-01-22 23:16:52.163 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:54 compute-0 nova_compute[182725]: 2026-01-22 23:16:54.305 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:54 compute-0 nova_compute[182725]: 2026-01-22 23:16:54.900 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:16:56 compute-0 sshd-session[248836]: Accepted publickey for zuul from 192.168.122.10 port 39554 ssh2: ECDSA SHA256:0yTbYJhz5f6Cyfe1HnXPR8OQqXzYYr0qadY/nyC1tOk
Jan 22 23:16:56 compute-0 systemd-logind[801]: New session 50 of user zuul.
Jan 22 23:16:56 compute-0 systemd[1]: Started Session 50 of User zuul.
Jan 22 23:16:56 compute-0 sshd-session[248836]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 23:16:56 compute-0 sudo[248840]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 23:16:56 compute-0 sudo[248840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 23:16:57 compute-0 nova_compute[182725]: 2026-01-22 23:16:57.166 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:16:59 compute-0 nova_compute[182725]: 2026-01-22 23:16:59.307 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:17:01 compute-0 anacron[103311]: Job `cron.monthly' started
Jan 22 23:17:01 compute-0 anacron[103311]: Job `cron.monthly' terminated
Jan 22 23:17:01 compute-0 anacron[103311]: Normal exit (3 jobs run)
Jan 22 23:17:01 compute-0 ovs-vsctl[249012]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 23:17:01 compute-0 nova_compute[182725]: 2026-01-22 23:17:01.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:02 compute-0 podman[249108]: 2026-01-22 23:17:02.124085148 +0000 UTC m=+0.057665405 container health_status ffb5ba113256268d524385b205add519e9e7d1ee5750316d9f5d8f113324edf3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 23:17:02 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 23:17:02 compute-0 nova_compute[182725]: 2026-01-22 23:17:02.169 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:17:02 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 23:17:02 compute-0 virtqemud[182297]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 23:17:03 compute-0 crontab[249442]: (root) LIST (root)
Jan 22 23:17:03 compute-0 nova_compute[182725]: 2026-01-22 23:17:03.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:04 compute-0 nova_compute[182725]: 2026-01-22 23:17:04.309 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:17:04 compute-0 nova_compute[182725]: 2026-01-22 23:17:04.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:04 compute-0 nova_compute[182725]: 2026-01-22 23:17:04.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 23:17:04 compute-0 nova_compute[182725]: 2026-01-22 23:17:04.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 23:17:04 compute-0 nova_compute[182725]: 2026-01-22 23:17:04.904 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 23:17:05 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 23:17:05 compute-0 systemd[1]: Started Hostname Service.
Jan 22 23:17:06 compute-0 nova_compute[182725]: 2026-01-22 23:17:06.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:06 compute-0 nova_compute[182725]: 2026-01-22 23:17:06.889 182729 DEBUG nova.compute.manager [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 23:17:07 compute-0 nova_compute[182725]: 2026-01-22 23:17:07.171 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 23:17:07 compute-0 podman[249788]: 2026-01-22 23:17:07.578247371 +0000 UTC m=+0.058632539 container health_status cf1e824b388c490b84b60dcd2706104b76a0f81941debac912d26b2be60b7393 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 23:17:07 compute-0 podman[249787]: 2026-01-22 23:17:07.597348313 +0000 UTC m=+0.080177601 container health_status c8575c2f0c06534ed4a369c84eb331acd838984b7ccb0074f32a7e516a4d3aa0 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1-f3fbfcfa92fd2002a770465144cd3b5845b450885b2815daf6e088f3614655c1'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 23:17:07 compute-0 nova_compute[182725]: 2026-01-22 23:17:07.889 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:08 compute-0 nova_compute[182725]: 2026-01-22 23:17:08.888 182729 DEBUG oslo_service.periodic_task [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 23:17:08 compute-0 nova_compute[182725]: 2026-01-22 23:17:08.908 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:17:08 compute-0 nova_compute[182725]: 2026-01-22 23:17:08.909 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:17:08 compute-0 nova_compute[182725]: 2026-01-22 23:17:08.910 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:17:08 compute-0 nova_compute[182725]: 2026-01-22 23:17:08.910 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.059 182729 WARNING nova.virt.libvirt.driver [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.061 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5368MB free_disk=73.02577590942383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.061 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.061 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.123 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.123 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.141 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing inventories for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.166 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating ProviderTree inventory for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.167 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Updating inventory in ProviderTree for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.182 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing aggregate associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.222 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Refreshing trait associations for resource provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.259 182729 DEBUG nova.compute.provider_tree [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7db789-7f4b-4901-9c88-ecf66d0aff43 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.318 182729 DEBUG nova.scheduler.client.report [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Inventory has not changed for provider 4f7db789-7f4b-4901-9c88-ecf66d0aff43 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.319 182729 DEBUG nova.compute.resource_tracker [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.320 182729 DEBUG oslo_concurrency.lockutils [None req-9e9d7d5a-0911-4db5-80d3-7753d88827e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 23:17:09 compute-0 nova_compute[182725]: 2026-01-22 23:17:09.353 182729 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
